Dec 04 10:15:02 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 04 10:15:02 crc restorecon[4678]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 10:15:02 crc restorecon[4678]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 
10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02
crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 
10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 10:15:02 crc restorecon[4678]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 
crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc 
restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 10:15:02 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 10:15:03 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 10:15:03 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 10:15:03 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 04 10:15:03 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 10:15:03 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 10:15:03 crc restorecon[4678]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 10:15:03 crc restorecon[4678]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 04 10:15:03 crc kubenswrapper[4831]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 10:15:03 crc kubenswrapper[4831]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 04 10:15:03 crc kubenswrapper[4831]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 10:15:03 crc kubenswrapper[4831]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 04 10:15:03 crc kubenswrapper[4831]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 04 10:15:03 crc kubenswrapper[4831]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.149025 4831 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151549 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151567 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151572 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151575 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151580 4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151584 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151587 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151591 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151596 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 10:15:03 crc 
kubenswrapper[4831]: W1204 10:15:03.151601 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151605 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151609 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151614 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151618 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151621 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151630 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151635 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151639 4831 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151643 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151647 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151650 4831 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151667 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151671 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 10:15:03 crc 
kubenswrapper[4831]: W1204 10:15:03.151677 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151682 4831 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151687 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151691 4831 feature_gate.go:330] unrecognized feature gate: Example Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151695 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151699 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151703 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151706 4831 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151710 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151714 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151718 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151722 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151725 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151729 4831 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 10:15:03 crc kubenswrapper[4831]: 
W1204 10:15:03.151732 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151736 4831 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151739 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151743 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151747 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151753 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151757 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151760 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151764 4831 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151767 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151770 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151774 4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151777 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151781 4831 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151785 4831 feature_gate.go:330] unrecognized feature 
gate: GatewayAPI Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151788 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151792 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151795 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151799 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151802 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151805 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151809 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151813 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151816 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151821 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151826 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151830 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151834 4831 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151837 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151841 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151844 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151848 4831 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151851 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.151855 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151932 4831 flags.go:64] FLAG: --address="0.0.0.0" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151940 4831 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151946 4831 flags.go:64] FLAG: --anonymous-auth="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151954 4831 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151959 4831 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151964 4831 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 04 10:15:03 crc 
kubenswrapper[4831]: I1204 10:15:03.151969 4831 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151974 4831 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151978 4831 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151983 4831 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151987 4831 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151992 4831 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.151996 4831 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152001 4831 flags.go:64] FLAG: --cgroup-root="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152005 4831 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152010 4831 flags.go:64] FLAG: --client-ca-file="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152014 4831 flags.go:64] FLAG: --cloud-config="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152019 4831 flags.go:64] FLAG: --cloud-provider="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152023 4831 flags.go:64] FLAG: --cluster-dns="[]" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152029 4831 flags.go:64] FLAG: --cluster-domain="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152033 4831 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152037 4831 flags.go:64] FLAG: --config-dir="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152041 4831 flags.go:64] FLAG: 
--container-hints="/etc/cadvisor/container_hints.json" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152045 4831 flags.go:64] FLAG: --container-log-max-files="5" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152050 4831 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152054 4831 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152059 4831 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152063 4831 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152067 4831 flags.go:64] FLAG: --contention-profiling="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152072 4831 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152076 4831 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152080 4831 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152084 4831 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152090 4831 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152094 4831 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152099 4831 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152103 4831 flags.go:64] FLAG: --enable-load-reader="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152107 4831 flags.go:64] FLAG: --enable-server="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152111 4831 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 04 
10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152115 4831 flags.go:64] FLAG: --event-burst="100" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152120 4831 flags.go:64] FLAG: --event-qps="50" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152124 4831 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152128 4831 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152132 4831 flags.go:64] FLAG: --eviction-hard="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152137 4831 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152141 4831 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152145 4831 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152150 4831 flags.go:64] FLAG: --eviction-soft="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152154 4831 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152158 4831 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152162 4831 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152166 4831 flags.go:64] FLAG: --experimental-mounter-path="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152170 4831 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152174 4831 flags.go:64] FLAG: --fail-swap-on="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152178 4831 flags.go:64] FLAG: --feature-gates="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152183 4831 flags.go:64] FLAG: --file-check-frequency="20s" Dec 04 10:15:03 crc 
kubenswrapper[4831]: I1204 10:15:03.152188 4831 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152192 4831 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152196 4831 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152200 4831 flags.go:64] FLAG: --healthz-port="10248" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152204 4831 flags.go:64] FLAG: --help="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152208 4831 flags.go:64] FLAG: --hostname-override="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152212 4831 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152217 4831 flags.go:64] FLAG: --http-check-frequency="20s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152221 4831 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152225 4831 flags.go:64] FLAG: --image-credential-provider-config="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152229 4831 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152234 4831 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152239 4831 flags.go:64] FLAG: --image-service-endpoint="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152243 4831 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152247 4831 flags.go:64] FLAG: --kube-api-burst="100" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152251 4831 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152256 4831 flags.go:64] FLAG: --kube-api-qps="50" Dec 
04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152260 4831 flags.go:64] FLAG: --kube-reserved="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152264 4831 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152268 4831 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152272 4831 flags.go:64] FLAG: --kubelet-cgroups="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152276 4831 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152280 4831 flags.go:64] FLAG: --lock-file="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152284 4831 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152288 4831 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152292 4831 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152298 4831 flags.go:64] FLAG: --log-json-split-stream="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152302 4831 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152306 4831 flags.go:64] FLAG: --log-text-split-stream="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152309 4831 flags.go:64] FLAG: --logging-format="text" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152313 4831 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152319 4831 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152323 4831 flags.go:64] FLAG: --manifest-url="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152328 4831 flags.go:64] FLAG: --manifest-url-header="" Dec 04 
10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152333 4831 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152337 4831 flags.go:64] FLAG: --max-open-files="1000000" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152342 4831 flags.go:64] FLAG: --max-pods="110" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152346 4831 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152350 4831 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152355 4831 flags.go:64] FLAG: --memory-manager-policy="None" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152358 4831 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152363 4831 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152367 4831 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152372 4831 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152381 4831 flags.go:64] FLAG: --node-status-max-images="50" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152385 4831 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152389 4831 flags.go:64] FLAG: --oom-score-adj="-999" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152393 4831 flags.go:64] FLAG: --pod-cidr="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152397 4831 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 04 10:15:03 crc kubenswrapper[4831]: 
I1204 10:15:03.152404 4831 flags.go:64] FLAG: --pod-manifest-path="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152408 4831 flags.go:64] FLAG: --pod-max-pids="-1" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152412 4831 flags.go:64] FLAG: --pods-per-core="0" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152416 4831 flags.go:64] FLAG: --port="10250" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152420 4831 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152424 4831 flags.go:64] FLAG: --provider-id="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152428 4831 flags.go:64] FLAG: --qos-reserved="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152432 4831 flags.go:64] FLAG: --read-only-port="10255" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152436 4831 flags.go:64] FLAG: --register-node="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152440 4831 flags.go:64] FLAG: --register-schedulable="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152444 4831 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152453 4831 flags.go:64] FLAG: --registry-burst="10" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152457 4831 flags.go:64] FLAG: --registry-qps="5" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152461 4831 flags.go:64] FLAG: --reserved-cpus="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152465 4831 flags.go:64] FLAG: --reserved-memory="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152471 4831 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152475 4831 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152479 4831 flags.go:64] FLAG: --rotate-certificates="false" Dec 04 
10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152484 4831 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152488 4831 flags.go:64] FLAG: --runonce="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152493 4831 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152497 4831 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152502 4831 flags.go:64] FLAG: --seccomp-default="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152506 4831 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152511 4831 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152515 4831 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152519 4831 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152523 4831 flags.go:64] FLAG: --storage-driver-password="root" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152527 4831 flags.go:64] FLAG: --storage-driver-secure="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152532 4831 flags.go:64] FLAG: --storage-driver-table="stats" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152536 4831 flags.go:64] FLAG: --storage-driver-user="root" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152540 4831 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152544 4831 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152548 4831 flags.go:64] FLAG: --system-cgroups="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152552 4831 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152558 4831 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152561 4831 flags.go:64] FLAG: --tls-cert-file="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152565 4831 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152574 4831 flags.go:64] FLAG: --tls-min-version="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152579 4831 flags.go:64] FLAG: --tls-private-key-file="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152583 4831 flags.go:64] FLAG: --topology-manager-policy="none" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152587 4831 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152591 4831 flags.go:64] FLAG: --topology-manager-scope="container" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152595 4831 flags.go:64] FLAG: --v="2" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152600 4831 flags.go:64] FLAG: --version="false" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152606 4831 flags.go:64] FLAG: --vmodule="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152611 4831 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.152615 4831 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152720 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152725 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152729 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 10:15:03 crc 
kubenswrapper[4831]: W1204 10:15:03.152733 4831 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152737 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152740 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152745 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152749 4831 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152753 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152757 4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152760 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152766 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152770 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152774 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152778 4831 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152781 4831 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152785 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152788 4831 feature_gate.go:330] unrecognized feature 
gate: CSIDriverSharedResource Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152792 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152795 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152799 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152803 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152810 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152814 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152817 4831 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152821 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152824 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152828 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152831 4831 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152834 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152838 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152842 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152847 4831 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152851 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152855 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152859 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152864 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152868 4831 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152872 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152876 4831 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152881 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152885 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152889 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152894 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152898 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152902 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152906 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152910 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152915 4831 feature_gate.go:330] unrecognized feature gate: Example Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152919 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152923 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152927 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152931 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152934 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152940 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152944 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152947 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152951 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152954 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152958 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152961 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152965 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152968 4831 
feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152972 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152975 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152979 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152982 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152985 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152989 4831 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152992 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.152996 4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.153007 4831 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.160075 4831 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.160116 4831 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 04 
10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160221 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160230 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160236 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160241 4831 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160247 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160253 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160258 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160263 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160270 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160279 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160284 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160290 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160295 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160301 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160306 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160311 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160316 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160322 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160329 4831 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160334 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160340 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160344 4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160350 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160355 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160360 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160365 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160370 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160375 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160380 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160385 4831 feature_gate.go:330] unrecognized feature gate: Example Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160389 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160395 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160401 4831 
feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160406 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160410 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160415 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160420 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160424 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160428 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160432 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160437 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160441 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160446 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160450 4831 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160455 4831 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160459 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160463 4831 feature_gate.go:330] unrecognized feature gate: 
ClusterMonitoringConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160468 4831 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160472 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160477 4831 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160481 4831 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160486 4831 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160490 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160494 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160500 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160505 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160509 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160514 4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160518 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160523 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160527 4831 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160531 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160536 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160541 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160546 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160552 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160558 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160565 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160571 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160577 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160582 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.160591 4831 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160772 4831 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160783 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160789 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160794 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160799 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160804 4831 feature_gate.go:330] unrecognized feature 
gate: ManagedBootImagesAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160808 4831 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160813 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160818 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160822 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160827 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160833 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160837 4831 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160842 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160847 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160853 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160858 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160864 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160873 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160878 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160883 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160888 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160893 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160899 4831 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160905 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160911 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160916 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160921 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160926 4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160930 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160935 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160941 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 
10:15:03.160945 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160950 4831 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160955 4831 feature_gate.go:330] unrecognized feature gate: Example Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160962 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160968 4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160974 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160980 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160986 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160991 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.160996 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161002 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161007 4831 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161013 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161017 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 
10:15:03.161023 4831 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161028 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161033 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161037 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161042 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161047 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161052 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161058 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161062 4831 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161067 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161072 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161077 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161081 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161086 4831 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161091 4831 feature_gate.go:330] unrecognized feature gate: 
AWSClusterHostedDNS Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161096 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161101 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161105 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161110 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161116 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161123 4831 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161129 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161134 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161139 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.161144 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.161152 4831 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true 
VolumeAttributesClass:false]} Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.161526 4831 server.go:940] "Client rotation is on, will bootstrap in background" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.164430 4831 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.164516 4831 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.165036 4831 server.go:997] "Starting client certificate rotation" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.165060 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.165235 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-15 03:46:36.252069364 +0000 UTC Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.165341 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1001h31m33.086733971s for next certificate rotation Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.169391 4831 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.171122 4831 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.181220 4831 log.go:25] "Validated CRI v1 runtime API" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.193619 4831 log.go:25] "Validated CRI v1 image API" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.195130 4831 server.go:1437] "Using cgroup driver setting received from the CRI runtime" 
cgroupDriver="systemd" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.198484 4831 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-04-10-10-29-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.198531 4831 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.213979 4831 manager.go:217] Machine: {Timestamp:2025-12-04 10:15:03.21274565 +0000 UTC m=+0.161920984 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:aaf9904b-e604-46a1-bdf5-7d2b7b9a992c BootID:ae8c7b9f-5f4a-44bc-a820-5600f29471a7 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:54:35:4d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:54:35:4d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ed:e0:b7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:cd:de:55 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:5f:2c:8e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:15:30:0c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:2e:d2:8a:54:12:35 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0e:c6:0b:10:65:9a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: 
DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 04 10:15:03 crc 
kubenswrapper[4831]: I1204 10:15:03.214191 4831 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.214327 4831 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.214894 4831 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.215105 4831 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.215144 4831 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.215365 4831 topology_manager.go:138] "Creating topology manager with none policy"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.215378 4831 container_manager_linux.go:303] "Creating device plugin manager"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.215717 4831 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.215753 4831 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.215938 4831 state_mem.go:36] "Initialized new in-memory state store"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.216042 4831 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.216844 4831 kubelet.go:418] "Attempting to sync node with API server"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.216868 4831 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.216916 4831 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.216932 4831 kubelet.go:324] "Adding apiserver pod source"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.216945 4831 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.218934 4831 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.219500 4831 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.220115 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.220214 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220294 4831 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.220298 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.220414 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220820 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220849 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220858 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220867 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220882 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220891 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220900 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220914 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220924 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220933 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220947 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.220956 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.221130 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.221876 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.222036 4831 server.go:1280] "Started kubelet"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.222290 4831 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 04 10:15:03 crc systemd[1]: Started Kubernetes Kubelet.
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.222653 4831 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.224175 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.224204 4831 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.224261 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 00:10:15.414976927 +0000 UTC
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.224802 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1045h55m12.190186959s for next certificate rotation
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.224523 4831 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.224906 4831 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.224531 4831 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.224618 4831 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.225030 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.225074 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.224116 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187dfba23cc07418 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 10:15:03.221621784 +0000 UTC m=+0.170797108,LastTimestamp:2025-12-04 10:15:03.221621784 +0000 UTC m=+0.170797108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.224564 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.225486 4831 factory.go:55] Registering systemd factory
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.225501 4831 factory.go:221] Registration of the systemd container factory successfully
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.227093 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="200ms"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.227962 4831 factory.go:153] Registering CRI-O factory
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.227985 4831 factory.go:221] Registration of the crio container factory successfully
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.228062 4831 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.228095 4831 factory.go:103] Registering Raw factory
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.228118 4831 manager.go:1196] Started watching for new ooms in manager
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.229392 4831 manager.go:319] Starting recovery of all containers
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.231418 4831 server.go:460] "Adding debug handlers to kubelet server"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238415 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238477 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238489 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238500 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238512 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238527 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238542 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238557 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238574 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238589 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238604 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238620 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238635 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238652 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238684 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238699 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238740 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238754 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238766 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238778 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238786 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238798 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238810 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238822 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238833 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238845 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238857 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238868 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238902 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238914 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238925 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238937 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238948 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238958 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238968 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238978 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238989 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.238999 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239011 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239023 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239033 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239045 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239059 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239078 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239093 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239108 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239121 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239133 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239145 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239161 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239174 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239190 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239210 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239228 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239243 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239259 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239275 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239291 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239303 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239319 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239334 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239348 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239362 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239383 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239399 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239417 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239432 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239447 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239463 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239478 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239495 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239510 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239526 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239541 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239556 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239575 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239591 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239608 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239624 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239641 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239656 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239692 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239707 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239722 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239737 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239757 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239772 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b"
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239787 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239804 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239820 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239836 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239848 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239863 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239877 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239892 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239908 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239926 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239943 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239961 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.239975 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240533 4831 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240572 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240591 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240608 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240622 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240645 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240677 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240696 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240715 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240730 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240745 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240759 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240775 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240791 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240806 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240820 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240833 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240846 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240862 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240877 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240889 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240904 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240921 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240934 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240951 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240967 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240981 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.240995 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241009 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241021 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241037 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241051 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241067 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241081 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241094 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 
10:15:03.241109 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241149 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241165 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241178 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241195 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241212 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241237 4831 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241252 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241268 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241282 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241297 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241312 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241327 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241343 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241360 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241375 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241389 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241402 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241417 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241432 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241449 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241464 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241478 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241490 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241502 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241513 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241525 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241536 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241547 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241558 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241570 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241582 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241594 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241604 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241617 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241628 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241640 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" 
seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241673 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241690 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241705 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241718 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241729 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241744 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241759 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241775 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241790 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241804 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241819 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241833 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241848 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241864 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241879 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241893 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241906 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241921 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241936 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241951 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241967 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241981 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.241995 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242011 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242025 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242040 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242055 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242069 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242082 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242103 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242120 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242134 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242148 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242164 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242180 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242198 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242212 4831 reconstruct.go:97] "Volume reconstruction finished"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.242222 4831 reconciler.go:26] "Reconciler: start to sync state"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.255140 4831 manager.go:324] Recovery completed
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.269095 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.271845 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.271887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.271899 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.272727 4831 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.272859 4831 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.272874 4831 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.272949 4831 state_mem.go:36] "Initialized new in-memory state store"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.275061 4831 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.275112 4831 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.275144 4831 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.275187 4831 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.298072 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.298420 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.321326 4831 policy_none.go:49] "None policy: Start"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.322421 4831 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.322480 4831 state_mem.go:35] "Initializing new in-memory state store"
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.328254 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.375522 4831 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.385034 4831 manager.go:334] "Starting Device Plugin manager"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.385081 4831 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.385092 4831 server.go:79] "Starting device plugin registration server"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.385508 4831 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.385527 4831 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.385936 4831 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.386013 4831 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.386020 4831 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.395695 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.427902 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="400ms"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.485817 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.487193 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.487252 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.487276 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.487322 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.488006 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.576217 4831 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.576378 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.577708 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.577739 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.577749 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.577846 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.578006 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.578062 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.578677 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.578710 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.578719 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.578776 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.578999 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.579068 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.580060 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.580104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.580118 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.580157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.580195 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.580217 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.580259 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.580433 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.580492 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.581005 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.581031 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.581045 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.581403 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.581423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.581430 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.581557 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.581744 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.582988 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.583041 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.583056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.583336 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.584808 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.584885 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.584896 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.585200 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.585249 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.587018 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.587125 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.587151 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.587253 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.587282 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.587290 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647197 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647280 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647332 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647377 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647419 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647459 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647502 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647537 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647763 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647801 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647869 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647908 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.647957 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.648002 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.648218 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.688388 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.689885 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.689944 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.689961 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.689990 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.690493 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.749804 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.749909 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.749943 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.749975 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750006 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750012 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750039 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750053 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750071 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750081 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750056 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750100 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750097 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750130 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750139 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750142 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName:
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750159 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750187 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750220 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750224 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750252 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 10:15:03 crc 
kubenswrapper[4831]: I1204 10:15:03.750254 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750276 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750282 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750286 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750394 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750339 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750355 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750256 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.750356 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: E1204 10:15:03.829555 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="800ms" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.934942 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.949092 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.964039 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-efb405decb488bbe3fddc391babfebb5a80fc94b4440d9298281348de387ecb7 WatchSource:0}: Error finding container efb405decb488bbe3fddc391babfebb5a80fc94b4440d9298281348de387ecb7: Status 404 returned error can't find the container with id efb405decb488bbe3fddc391babfebb5a80fc94b4440d9298281348de387ecb7 Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.975908 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5948bf49063faf663102dfa942895bf2c391e2af1b969829a7f184c0d703413e WatchSource:0}: Error finding container 5948bf49063faf663102dfa942895bf2c391e2af1b969829a7f184c0d703413e: Status 404 returned error can't find the container with id 5948bf49063faf663102dfa942895bf2c391e2af1b969829a7f184c0d703413e Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.983049 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: I1204 10:15:03.992437 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 04 10:15:03 crc kubenswrapper[4831]: W1204 10:15:03.999322 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-45f7d97e6098ef8ba401afc6687a061274638745b3c834e86d88b67662a667c3 WatchSource:0}: Error finding container 45f7d97e6098ef8ba401afc6687a061274638745b3c834e86d88b67662a667c3: Status 404 returned error can't find the container with id 45f7d97e6098ef8ba401afc6687a061274638745b3c834e86d88b67662a667c3 Dec 04 10:15:04 crc kubenswrapper[4831]: W1204 10:15:04.006520 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c4434b980d12502c79245fb4c2d4c6035bfa40c1699e3ae79cbe8a405470fe09 WatchSource:0}: Error finding container c4434b980d12502c79245fb4c2d4c6035bfa40c1699e3ae79cbe8a405470fe09: Status 404 returned error can't find the container with id c4434b980d12502c79245fb4c2d4c6035bfa40c1699e3ae79cbe8a405470fe09 Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.016994 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:15:04 crc kubenswrapper[4831]: W1204 10:15:04.040412 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-7a1d20ee91ac170294484efad6d5cb21a1d2ce01d6920f1ffdab47a590eaff8c WatchSource:0}: Error finding container 7a1d20ee91ac170294484efad6d5cb21a1d2ce01d6920f1ffdab47a590eaff8c: Status 404 returned error can't find the container with id 7a1d20ee91ac170294484efad6d5cb21a1d2ce01d6920f1ffdab47a590eaff8c Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.090715 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.091778 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.091816 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.091827 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.091851 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 10:15:04 crc kubenswrapper[4831]: E1204 10:15:04.092305 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc" Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.223105 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused 
Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.279239 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5948bf49063faf663102dfa942895bf2c391e2af1b969829a7f184c0d703413e"} Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.280484 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"efb405decb488bbe3fddc391babfebb5a80fc94b4440d9298281348de387ecb7"} Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.281880 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7a1d20ee91ac170294484efad6d5cb21a1d2ce01d6920f1ffdab47a590eaff8c"} Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.282993 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c4434b980d12502c79245fb4c2d4c6035bfa40c1699e3ae79cbe8a405470fe09"} Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.284033 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"45f7d97e6098ef8ba401afc6687a061274638745b3c834e86d88b67662a667c3"} Dec 04 10:15:04 crc kubenswrapper[4831]: W1204 10:15:04.504798 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Dec 04 10:15:04 crc kubenswrapper[4831]: E1204 10:15:04.504922 4831 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Dec 04 10:15:04 crc kubenswrapper[4831]: E1204 10:15:04.630262 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="1.6s" Dec 04 10:15:04 crc kubenswrapper[4831]: W1204 10:15:04.713441 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Dec 04 10:15:04 crc kubenswrapper[4831]: E1204 10:15:04.713566 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Dec 04 10:15:04 crc kubenswrapper[4831]: W1204 10:15:04.757341 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Dec 04 10:15:04 crc kubenswrapper[4831]: E1204 10:15:04.757460 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Dec 04 10:15:04 crc kubenswrapper[4831]: W1204 10:15:04.852289 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Dec 04 10:15:04 crc kubenswrapper[4831]: E1204 10:15:04.852367 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.893032 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.894727 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.894756 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.894766 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:04 crc kubenswrapper[4831]: I1204 10:15:04.894787 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 10:15:04 crc kubenswrapper[4831]: E1204 10:15:04.895280 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection 
refused" node="crc" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.223453 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.289564 4831 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="2536b44f913751976bc76295ac52a233a3de641e3fc272594a4bd94a5c788755" exitCode=0 Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.289695 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.289691 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"2536b44f913751976bc76295ac52a233a3de641e3fc272594a4bd94a5c788755"} Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.290918 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.290960 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.290977 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.292100 4831 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43" exitCode=0 Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.292157 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43"} Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.292221 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.293689 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.293739 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.293757 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.298323 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0"} Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.298388 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630"} Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.298413 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d"} Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.298433 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4"} Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.298536 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.300151 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.300196 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.300213 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.315529 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a" exitCode=0 Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.315690 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a"} Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.315888 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.318161 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.318221 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:05 
crc kubenswrapper[4831]: I1204 10:15:05.318247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.326127 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.327363 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.327418 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.327442 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.328244 4831 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bd04a4f7f5fb72ebef465ad390d97c7f6d50dd81eccfb30291293be20949bdc0" exitCode=0 Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.328301 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bd04a4f7f5fb72ebef465ad390d97c7f6d50dd81eccfb30291293be20949bdc0"} Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.328473 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.330048 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.330188 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:05 crc kubenswrapper[4831]: I1204 10:15:05.330306 4831 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.332231 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8"} Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.332273 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f"} Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.332282 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5"} Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.332292 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8"} Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.332300 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320"} Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.332323 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.333573 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:06 
crc kubenswrapper[4831]: I1204 10:15:06.333607 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.333619 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.334739 4831 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6644662acf4761d9eed9d15d31fa2e6065b1b7ce16bfff24cd4f18bd56e31c4a" exitCode=0 Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.334788 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6644662acf4761d9eed9d15d31fa2e6065b1b7ce16bfff24cd4f18bd56e31c4a"} Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.334872 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.335503 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.335542 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.335554 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.337812 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9bc05b9fff58df35fc8bfab430467a16ae22396ce878934c9dd2ad0a21043f74"} Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.337872 4831 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.338921 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.338956 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.338975 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.340145 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.340155 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.340146 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"745df7e6079182e2ba297b8e268d8a1647ce0f856e99f8f90d3dc92a6a071087"} Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.340290 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"aa8d355d4394ba0854fc067550b15f70e9e72a2a1efb0d7f981574db2f81836c"} Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.340314 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8fcdee1a860ddfe046b4983d9b2422cf6c993be574f549cb3667179d92944884"} Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.341020 4831 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.341046 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.341055 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.341053 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.341079 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.341097 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.495381 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.496546 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.496579 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.496592 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.496618 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 10:15:06 crc kubenswrapper[4831]: I1204 10:15:06.995090 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.346462 
4831 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f9e29a98ab25f4c62a80f791b5f116cf1909d58e5f82015498e865308c130490" exitCode=0 Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.346598 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.346618 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.346639 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.346737 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.346789 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.346875 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.346976 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.347870 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f9e29a98ab25f4c62a80f791b5f116cf1909d58e5f82015498e865308c130490"} Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.348605 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.348690 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.348710 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.348834 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.348879 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.348897 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.349069 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.349115 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.349155 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.349126 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.349096 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.349185 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.349198 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.349168 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Dec 04 10:15:07 crc kubenswrapper[4831]: I1204 10:15:07.349197 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.355043 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8dbfd3c5a45c64ec6513591b5c24eca8f73775bd202538f0a41a7376cc88c811"} Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.355120 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9c765e7a3cb9404267fa79add6c9d7fdb4bd6b23adaa487b95fcbb75eef2243b"} Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.355140 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"739642c8aa1750cbbd451838de4c838cee04484de09e6cdf58a7f4d2defec704"} Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.355163 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"45325e02400d532e2046dfd4ac0380bdbb4c7a826c2596eccb9ecd4df3c12d9e"} Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.808603 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.808766 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.808798 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.810185 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.810218 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.810228 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.810891 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.810947 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.811723 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.811752 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:08 crc kubenswrapper[4831]: I1204 10:15:08.811760 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.108572 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.363950 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7725a58cec8ccec6d13a46b488f6c63858267e73137a1fb7fa76fc9956cf4527"} Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.364021 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.364079 4831 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.364110 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.364906 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.364982 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.365017 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.365102 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.365122 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.365131 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.540330 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.775384 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.775611 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.776797 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 
10:15:09.776826 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.776837 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.812336 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.919279 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.996125 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 10:15:09 crc kubenswrapper[4831]: I1204 10:15:09.996229 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:15:10 crc kubenswrapper[4831]: I1204 10:15:10.366640 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:10 crc kubenswrapper[4831]: I1204 10:15:10.366768 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:10 crc kubenswrapper[4831]: I1204 10:15:10.368235 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:10 crc kubenswrapper[4831]: I1204 
10:15:10.368283 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:10 crc kubenswrapper[4831]: I1204 10:15:10.368294 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:10 crc kubenswrapper[4831]: I1204 10:15:10.368932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:10 crc kubenswrapper[4831]: I1204 10:15:10.368973 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:10 crc kubenswrapper[4831]: I1204 10:15:10.368986 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:11 crc kubenswrapper[4831]: I1204 10:15:11.369324 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:11 crc kubenswrapper[4831]: I1204 10:15:11.370882 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:11 crc kubenswrapper[4831]: I1204 10:15:11.370944 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:11 crc kubenswrapper[4831]: I1204 10:15:11.370970 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:13 crc kubenswrapper[4831]: I1204 10:15:13.035430 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:15:13 crc kubenswrapper[4831]: I1204 10:15:13.035787 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:13 crc kubenswrapper[4831]: I1204 10:15:13.037292 4831 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:13 crc kubenswrapper[4831]: I1204 10:15:13.037347 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:13 crc kubenswrapper[4831]: I1204 10:15:13.037374 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:13 crc kubenswrapper[4831]: I1204 10:15:13.041700 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:15:13 crc kubenswrapper[4831]: I1204 10:15:13.375105 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:13 crc kubenswrapper[4831]: I1204 10:15:13.376506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:13 crc kubenswrapper[4831]: I1204 10:15:13.376573 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:13 crc kubenswrapper[4831]: I1204 10:15:13.376592 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:13 crc kubenswrapper[4831]: E1204 10:15:13.395815 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 10:15:15 crc kubenswrapper[4831]: I1204 10:15:15.273544 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:15:15 crc kubenswrapper[4831]: I1204 10:15:15.273863 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:15 crc kubenswrapper[4831]: I1204 10:15:15.275521 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 04 10:15:15 crc kubenswrapper[4831]: I1204 10:15:15.275589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:15 crc kubenswrapper[4831]: I1204 10:15:15.275611 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:15 crc kubenswrapper[4831]: I1204 10:15:15.278749 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:15:15 crc kubenswrapper[4831]: I1204 10:15:15.380132 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:15 crc kubenswrapper[4831]: I1204 10:15:15.381586 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:15 crc kubenswrapper[4831]: I1204 10:15:15.381690 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:15 crc kubenswrapper[4831]: I1204 10:15:15.381720 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:16 crc kubenswrapper[4831]: I1204 10:15:16.223900 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 04 10:15:16 crc kubenswrapper[4831]: E1204 10:15:16.231528 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 04 10:15:16 crc kubenswrapper[4831]: E1204 10:15:16.497817 4831 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 04 10:15:16 crc kubenswrapper[4831]: I1204 10:15:16.760723 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 04 10:15:16 crc kubenswrapper[4831]: I1204 10:15:16.760796 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 04 10:15:16 crc kubenswrapper[4831]: I1204 10:15:16.764257 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 04 10:15:16 crc kubenswrapper[4831]: I1204 10:15:16.764317 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.114347 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.114771 4831 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.116748 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.116795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.116807 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.119262 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.390099 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.391254 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.391294 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.391307 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.581073 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.581344 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.583308 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 
10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.583379 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.583406 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.604831 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.698494 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.699834 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.699876 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.699888 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.699917 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 10:15:19 crc kubenswrapper[4831]: E1204 10:15:19.704937 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 04 10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.995861 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 
10:15:19 crc kubenswrapper[4831]: I1204 10:15:19.995948 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 10:15:20 crc kubenswrapper[4831]: I1204 10:15:20.392549 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:20 crc kubenswrapper[4831]: I1204 10:15:20.393960 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:20 crc kubenswrapper[4831]: I1204 10:15:20.394015 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:20 crc kubenswrapper[4831]: I1204 10:15:20.394035 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:21 crc kubenswrapper[4831]: I1204 10:15:21.735533 4831 trace.go:236] Trace[1836725955]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 10:15:07.522) (total time: 14213ms): Dec 04 10:15:21 crc kubenswrapper[4831]: Trace[1836725955]: ---"Objects listed" error: 14213ms (10:15:21.735) Dec 04 10:15:21 crc kubenswrapper[4831]: Trace[1836725955]: [14.213105985s] [14.213105985s] END Dec 04 10:15:21 crc kubenswrapper[4831]: I1204 10:15:21.735561 4831 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 10:15:21 crc kubenswrapper[4831]: I1204 10:15:21.735639 4831 trace.go:236] Trace[1080138117]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 10:15:07.371) (total time: 14363ms): Dec 04 10:15:21 crc kubenswrapper[4831]: 
Trace[1080138117]: ---"Objects listed" error: 14363ms (10:15:21.735) Dec 04 10:15:21 crc kubenswrapper[4831]: Trace[1080138117]: [14.363777897s] [14.363777897s] END Dec 04 10:15:21 crc kubenswrapper[4831]: I1204 10:15:21.735683 4831 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 10:15:21 crc kubenswrapper[4831]: I1204 10:15:21.737551 4831 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 04 10:15:21 crc kubenswrapper[4831]: I1204 10:15:21.762617 4831 trace.go:236] Trace[1735662573]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 10:15:06.819) (total time: 14943ms): Dec 04 10:15:21 crc kubenswrapper[4831]: Trace[1735662573]: ---"Objects listed" error: 14943ms (10:15:21.762) Dec 04 10:15:21 crc kubenswrapper[4831]: Trace[1735662573]: [14.943308528s] [14.943308528s] END Dec 04 10:15:21 crc kubenswrapper[4831]: I1204 10:15:21.762651 4831 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 10:15:21 crc kubenswrapper[4831]: I1204 10:15:21.766969 4831 trace.go:236] Trace[1421066187]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 10:15:07.625) (total time: 14140ms): Dec 04 10:15:21 crc kubenswrapper[4831]: Trace[1421066187]: ---"Objects listed" error: 14140ms (10:15:21.765) Dec 04 10:15:21 crc kubenswrapper[4831]: Trace[1421066187]: [14.140326776s] [14.140326776s] END Dec 04 10:15:21 crc kubenswrapper[4831]: I1204 10:15:21.767028 4831 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.078164 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53362->192.168.126.11:17697: 
read: connection reset by peer" start-of-body= Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.078214 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53368->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.078224 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53362->192.168.126.11:17697: read: connection reset by peer" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.078275 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53368->192.168.126.11:17697: read: connection reset by peer" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.078722 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.078800 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: 
connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.079103 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.079132 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.228464 4831 apiserver.go:52] "Watching apiserver" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.231101 4831 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.231563 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-xc5vd","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.232061 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.232187 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.232073 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.232348 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.232061 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.232446 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.232541 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.232601 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-xc5vd" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.232611 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.232879 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.235509 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.235896 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.236153 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.236305 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.236446 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.236676 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.236789 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.236857 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.236883 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.236906 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.236921 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.237200 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.260732 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.270417 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.279916 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.294927 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.305042 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.313098 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.323075 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.325608 4831 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.331000 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341360 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341412 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341440 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341464 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341525 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341548 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341570 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341590 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341614 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341642 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341692 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341716 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341735 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341762 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 10:15:22 crc kubenswrapper[4831]: 
I1204 10:15:22.341783 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341774 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341804 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341863 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.341890 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.342468 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.342688 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.342691 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.342935 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.342977 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343035 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343240 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343332 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343339 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343412 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343456 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343497 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343534 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343568 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343597 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343630 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343683 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343710 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343742 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343773 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343801 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343830 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343862 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343890 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343917 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343943 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.343970 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344003 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344031 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344056 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344084 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344111 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344136 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344163 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344194 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344220 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344250 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344370 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344396 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344428 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344460 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344483 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344510 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344537 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344560 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344590 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344616 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344643 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344689 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344717 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.344746 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.345624 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 10:15:22 crc 
kubenswrapper[4831]: I1204 10:15:22.345100 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346442 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346594 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346648 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346681 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346740 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346779 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346818 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346853 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346897 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346910 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346933 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346941 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347015 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347055 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.345234 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347083 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347116 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347147 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347198 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347227 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347288 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347428 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347473 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347513 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347546 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347575 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " 
Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347599 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347632 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347728 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347776 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349150 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.345249 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349218 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.345546 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.345733 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.345734 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.345765 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346067 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346145 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346224 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346372 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346389 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.346402 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347137 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347146 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347418 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347695 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347767 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347797 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.347766 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.348092 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.348108 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.348232 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.348315 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.348523 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.348798 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349370 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349258 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349633 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349679 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349712 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349738 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349760 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349787 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349813 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349835 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349845 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: 
"marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349865 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349894 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349921 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349946 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.349978 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.350012 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.350039 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.350278 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.350334 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.350339 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.350480 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.350750 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.350793 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.350823 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.350853 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.350913 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:15:22.850858167 +0000 UTC m=+19.800033511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.350976 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351059 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351102 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351150 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351202 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351335 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351379 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351384 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351457 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351489 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351515 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351527 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351536 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351635 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351723 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351758 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351622 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351787 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351803 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351818 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351896 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351926 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351937 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.351990 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352039 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352090 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352208 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352234 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352254 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352316 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352365 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352397 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352426 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" 
(UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352452 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352481 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352512 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352538 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352570 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352580 4831 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352598 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352623 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352619 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352644 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352673 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.352619 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353002 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353022 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353170 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353151 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353207 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353234 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353344 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353379 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353504 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353544 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353568 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353587 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353607 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353626 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353647 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353686 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353712 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.353730 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354363 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354394 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354416 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354432 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354451 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354471 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354492 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354508 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354530 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354549 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: 
I1204 10:15:22.354567 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354587 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354606 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354622 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354751 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355034 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355063 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355081 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355102 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355170 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355191 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355213 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355281 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355823 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355855 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355877 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355899 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355916 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355938 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355959 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.355984 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.356013 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 10:15:22 crc 
kubenswrapper[4831]: I1204 10:15:22.356033 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.356051 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.356070 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.357729 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.357847 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.357893 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.357923 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.357959 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.357992 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358028 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358062 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358095 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l96vv\" (UniqueName: \"kubernetes.io/projected/f2a02957-2de3-4874-b43e-85be9e748dab-kube-api-access-l96vv\") pod \"node-resolver-xc5vd\" (UID: \"f2a02957-2de3-4874-b43e-85be9e748dab\") " pod="openshift-dns/node-resolver-xc5vd" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358131 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358163 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2a02957-2de3-4874-b43e-85be9e748dab-hosts-file\") pod \"node-resolver-xc5vd\" (UID: \"f2a02957-2de3-4874-b43e-85be9e748dab\") " pod="openshift-dns/node-resolver-xc5vd" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358195 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358232 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358268 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358458 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358569 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358631 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359604 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359627 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359645 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359683 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359698 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359712 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359724 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359741 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359759 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359773 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359792 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359808 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359821 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359834 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" 
Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359852 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359865 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359877 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359893 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359913 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359929 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359939 4831 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359950 4831 reconciler_common.go:293] "Volume 
detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359961 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.362323 4831 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359972 4831 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363218 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363269 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363294 4831 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363362 4831 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363437 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363459 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363481 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363512 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363533 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363558 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363580 4831 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") 
on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363610 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363630 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363654 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363711 4831 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363735 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363759 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363781 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363810 4831 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363833 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363855 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363876 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363904 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.363929 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365236 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 
10:15:22.363954 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365388 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365414 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365435 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365448 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365459 4831 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365474 4831 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365486 4831 reconciler_common.go:293] "Volume 
detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365701 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365776 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.354372 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.379248 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.379260 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.379309 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.379330 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.379353 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.379518 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.379813 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.379934 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:22.879410065 +0000 UTC m=+19.828585559 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.380033 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.380048 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.380241 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.380313 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.356474 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.356574 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.356624 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.356784 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.356201 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.356900 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.356931 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.357492 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.357502 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.357507 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.357690 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.357996 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358216 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.358434 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359187 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359272 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359320 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.359441 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.360222 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.360369 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.380680 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:22.880652038 +0000 UTC m=+19.829827352 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.360497 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.360846 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.360992 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.361885 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365585 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.365890 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.366096 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.366082 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.368389 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.368576 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.370544 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.374862 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.374955 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.374990 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.375625 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.375650 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.376941 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.376975 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.377056 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.377118 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.377309 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.377359 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.377446 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.377548 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.377711 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.377841 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.377968 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378082 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378138 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.370376 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378298 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378330 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378556 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378597 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378645 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378697 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378781 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378916 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.378996 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.379039 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.379137 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.379230 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.356267 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.379265 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.380881 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.380969 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.381218 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.381333 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.383203 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:22.883188515 +0000 UTC m=+19.832363939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.381736 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.381385 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.381575 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.381841 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.382069 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.382114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.384163 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.384383 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.384468 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.384719 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.385045 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.385107 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.385120 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.385401 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.385511 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.385565 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.385677 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.385746 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.386129 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.386215 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.386390 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.386571 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.386783 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.386851 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.387123 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.387644 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.392909 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.392980 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.393002 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.393555 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.394410 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.396168 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.397225 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.397575 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.400030 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.402081 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.402154 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.402378 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.402387 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.402783 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.402033 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.405727 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.407156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.407359 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8" exitCode=255 Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.407400 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8"} Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.407517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 
10:15:22.407937 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.408834 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.408840 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.408859 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.408903 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.408955 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: 
"b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.408967 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:22.908948069 +0000 UTC m=+19.858123383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.409267 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.409403 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.409434 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.409837 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.410182 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.410910 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.411449 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.410489 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.411985 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.412156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.412209 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.412260 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.412743 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.413936 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.414222 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.414367 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.418334 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.419110 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.419301 4831 scope.go:117] "RemoveContainer" containerID="83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.420204 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.425903 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.429282 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.432241 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.445334 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.446638 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.459396 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466215 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466267 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l96vv\" (UniqueName: \"kubernetes.io/projected/f2a02957-2de3-4874-b43e-85be9e748dab-kube-api-access-l96vv\") pod \"node-resolver-xc5vd\" (UID: \"f2a02957-2de3-4874-b43e-85be9e748dab\") " pod="openshift-dns/node-resolver-xc5vd" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466300 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/f2a02957-2de3-4874-b43e-85be9e748dab-hosts-file\") pod \"node-resolver-xc5vd\" (UID: \"f2a02957-2de3-4874-b43e-85be9e748dab\") " pod="openshift-dns/node-resolver-xc5vd" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466336 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466386 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466399 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466412 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466423 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466435 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc 
kubenswrapper[4831]: I1204 10:15:22.466447 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466459 4831 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466471 4831 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466481 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466494 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466504 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466517 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466528 4831 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466540 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466552 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466563 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466574 4831 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466585 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466599 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466611 4831 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466624 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466635 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466646 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466676 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466689 4831 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466701 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466713 4831 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466726 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466736 4831 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466747 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466767 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466780 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466794 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466806 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466817 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466878 4831 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466891 4831 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466902 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466913 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466924 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466951 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath 
\"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466964 4831 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466975 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466987 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466989 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466998 4831 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467057 4831 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467067 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467077 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467085 4831 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467094 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.466875 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2a02957-2de3-4874-b43e-85be9e748dab-hosts-file\") pod \"node-resolver-xc5vd\" (UID: \"f2a02957-2de3-4874-b43e-85be9e748dab\") " pod="openshift-dns/node-resolver-xc5vd" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467103 4831 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467153 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467166 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467180 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467053 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467196 4831 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467242 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467256 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467266 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467278 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467289 4831 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467302 4831 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467313 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467345 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467354 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467363 4831 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467372 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467381 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467390 4831 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467398 4831 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467406 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467414 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467421 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467430 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467438 4831 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467447 4831 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467455 4831 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467463 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467471 4831 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467478 4831 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467487 4831 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467494 4831 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467502 4831 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467510 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467519 4831 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467527 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467537 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467547 4831 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 
10:15:22.467554 4831 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467563 4831 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467572 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467581 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467589 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467597 4831 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467605 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467613 4831 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467622 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467631 4831 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467640 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467648 4831 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467680 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467692 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467703 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath 
\"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467716 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467724 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467733 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467742 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467752 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467763 4831 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467772 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc 
kubenswrapper[4831]: I1204 10:15:22.467781 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467792 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467803 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467814 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467826 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467892 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467908 4831 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc 
kubenswrapper[4831]: I1204 10:15:22.467920 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467929 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467937 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467945 4831 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467954 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467962 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467971 4831 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467979 4831 reconciler_common.go:293] "Volume detached for 
volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467988 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.467995 4831 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.468004 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.468013 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.468021 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.468029 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.468037 4831 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" 
Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.468045 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.468053 4831 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.468061 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.468071 4831 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.468078 4831 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.469931 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.476531 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.482489 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l96vv\" (UniqueName: \"kubernetes.io/projected/f2a02957-2de3-4874-b43e-85be9e748dab-kube-api-access-l96vv\") pod \"node-resolver-xc5vd\" (UID: \"f2a02957-2de3-4874-b43e-85be9e748dab\") " pod="openshift-dns/node-resolver-xc5vd" Dec 04 10:15:22 crc kubenswrapper[4831]: 
I1204 10:15:22.487303 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.545375 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.553047 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.559025 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xc5vd" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.566309 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 10:15:22 crc kubenswrapper[4831]: W1204 10:15:22.567089 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-2ca9691b1764372deab27ff5c314009eca094dd2ad98337eb5e21e0b8f883a72 WatchSource:0}: Error finding container 2ca9691b1764372deab27ff5c314009eca094dd2ad98337eb5e21e0b8f883a72: Status 404 returned error can't find the container with id 2ca9691b1764372deab27ff5c314009eca094dd2ad98337eb5e21e0b8f883a72 Dec 04 10:15:22 crc kubenswrapper[4831]: W1204 10:15:22.570279 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2a02957_2de3_4874_b43e_85be9e748dab.slice/crio-8f8377edf7372e0e6272a43d3166b8095147559ce622ead2c1b2dc54cc273e4f WatchSource:0}: Error finding container 8f8377edf7372e0e6272a43d3166b8095147559ce622ead2c1b2dc54cc273e4f: Status 404 returned error can't find the container with id 8f8377edf7372e0e6272a43d3166b8095147559ce622ead2c1b2dc54cc273e4f Dec 04 10:15:22 crc kubenswrapper[4831]: W1204 10:15:22.583996 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-27278f1c84de4bac5406bed46a5a987e178bc2b9f210cdaefb4a13bee785c44a WatchSource:0}: Error finding container 27278f1c84de4bac5406bed46a5a987e178bc2b9f210cdaefb4a13bee785c44a: Status 404 returned error can't find the container with id 27278f1c84de4bac5406bed46a5a987e178bc2b9f210cdaefb4a13bee785c44a Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.871996 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.872199 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:15:23.872162557 +0000 UTC m=+20.821337871 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.973388 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.973432 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.973452 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:22 crc kubenswrapper[4831]: I1204 10:15:22.973471 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973543 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973595 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:23.97358177 +0000 UTC m=+20.922757084 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973595 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973636 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973674 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973603 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973747 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973767 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973697 4831 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:23.973677353 +0000 UTC m=+20.922852747 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973689 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973840 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:23.973819727 +0000 UTC m=+20.922995041 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:22 crc kubenswrapper[4831]: E1204 10:15:22.973858 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:23.973851327 +0000 UTC m=+20.923026641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.279568 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.280112 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.280928 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 
10:15:23.281535 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.282089 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.282585 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.283192 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.283726 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.284431 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.284981 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.285470 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 
10:15:23.286148 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.286719 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.287226 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.290021 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.290543 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.291264 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.291986 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.292762 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 
10:15:23.294776 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.295260 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.295835 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.296649 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.297295 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.298189 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.298806 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.302361 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 
10:15:23.302892 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.303436 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.304308 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.304772 4831 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.304871 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.304884 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.307170 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.307634 4831 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.308036 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.310065 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.310737 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.311695 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.312382 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.314846 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.315395 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.316072 4831 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.317145 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.318283 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.318845 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.319823 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.320324 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.321599 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.322192 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.323044 4831 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.323475 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.323965 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.323975 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.324928 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.325365 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 
10:15:23.346893 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.362014 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.375162 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.389553 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.411296 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xc5vd" event={"ID":"f2a02957-2de3-4874-b43e-85be9e748dab","Type":"ContainerStarted","Data":"c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7"} Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.411343 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xc5vd" event={"ID":"f2a02957-2de3-4874-b43e-85be9e748dab","Type":"ContainerStarted","Data":"8f8377edf7372e0e6272a43d3166b8095147559ce622ead2c1b2dc54cc273e4f"} Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.412869 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d"} Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.412899 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2ca9691b1764372deab27ff5c314009eca094dd2ad98337eb5e21e0b8f883a72"} Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.414239 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4"} Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.414258 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3"} Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.414267 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bfd8240b862e3d6bf729670d3fba9a1983f15db4b249631850f1a4bda007dff5"} Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.416208 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.417703 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc"} Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.420887 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.422894 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.424615 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"27278f1c84de4bac5406bed46a5a987e178bc2b9f210cdaefb4a13bee785c44a"} Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.445307 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.457867 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.469240 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.483295 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.496961 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.506886 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.519176 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.536330 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.554004 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.881486 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.881683 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:15:25.881639899 +0000 UTC m=+22.830815213 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.982823 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.982874 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.982929 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:23 crc kubenswrapper[4831]: I1204 10:15:23.982953 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983013 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983034 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983067 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983082 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983103 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983089 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:25.983068662 +0000 UTC m=+22.932243976 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983115 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983133 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:25.983119044 +0000 UTC m=+22.932294398 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983088 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983150 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983153 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:25.983144644 +0000 UTC m=+22.932320038 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:23 crc kubenswrapper[4831]: E1204 10:15:23.983185 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:25.983170185 +0000 UTC m=+22.932345499 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.275650 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.275650 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:24 crc kubenswrapper[4831]: E1204 10:15:24.275791 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:24 crc kubenswrapper[4831]: E1204 10:15:24.275836 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.275650 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:24 crc kubenswrapper[4831]: E1204 10:15:24.275911 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.446852 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zk2rt"] Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.447562 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.448503 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5g27v"] Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.448752 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.449213 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-g76nn"] Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.450239 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.450697 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.451474 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.451648 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.452154 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.452481 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.452648 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.452815 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.452904 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.453288 4831 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.454084 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.454132 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.462076 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.473179 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.497006 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.519158 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.542023 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.562300 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.582639 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.588774 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-var-lib-kubelet\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.588807 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-os-release\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.588836 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-multus-cni-dir\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.588855 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6a78509-d612-4338-8562-9b0627c1793f-cni-binary-copy\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.588870 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-run-k8s-cni-cncf-io\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.588883 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-multus-conf-dir\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.588916 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.588943 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-var-lib-cni-multus\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.588965 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-cnibin\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.588990 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-system-cni-dir\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589022 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8475bb26-8864-4d49-935b-db7d4cb73387-rootfs\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589058 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-run-netns\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589073 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-etc-kubernetes\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: 
I1204 10:15:24.589086 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8475bb26-8864-4d49-935b-db7d4cb73387-proxy-tls\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589177 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt7gj\" (UniqueName: \"kubernetes.io/projected/8475bb26-8864-4d49-935b-db7d4cb73387-kube-api-access-mt7gj\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589214 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-hostroot\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589234 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-os-release\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589255 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6a78509-d612-4338-8562-9b0627c1793f-multus-daemon-config\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc 
kubenswrapper[4831]: I1204 10:15:24.589274 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589307 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5j6t\" (UniqueName: \"kubernetes.io/projected/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-kube-api-access-j5j6t\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589339 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-var-lib-cni-bin\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589361 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gp29\" (UniqueName: \"kubernetes.io/projected/c6a78509-d612-4338-8562-9b0627c1793f-kube-api-access-5gp29\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589383 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-system-cni-dir\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " 
pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589398 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-cnibin\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589412 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-multus-socket-dir-parent\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589428 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589458 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8475bb26-8864-4d49-935b-db7d4cb73387-mcd-auth-proxy-config\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.589487 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-run-multus-certs\") pod \"multus-5g27v\" (UID: 
\"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.590719 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.601251 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.612562 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.620754 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac3
9aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.630419 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.639577 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.650649 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.664068 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.676690 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690473 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-system-cni-dir\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690532 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8475bb26-8864-4d49-935b-db7d4cb73387-rootfs\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690569 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-run-netns\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690600 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-etc-kubernetes\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690630 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8475bb26-8864-4d49-935b-db7d4cb73387-proxy-tls\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690639 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-system-cni-dir\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690703 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-etc-kubernetes\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690708 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt7gj\" (UniqueName: \"kubernetes.io/projected/8475bb26-8864-4d49-935b-db7d4cb73387-kube-api-access-mt7gj\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690724 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-run-netns\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690734 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8475bb26-8864-4d49-935b-db7d4cb73387-rootfs\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690777 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-hostroot\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690800 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-hostroot\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690822 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690848 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5j6t\" (UniqueName: \"kubernetes.io/projected/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-kube-api-access-j5j6t\") pod \"multus-additional-cni-plugins-zk2rt\" 
(UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690873 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-os-release\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690892 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6a78509-d612-4338-8562-9b0627c1793f-multus-daemon-config\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690910 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gp29\" (UniqueName: \"kubernetes.io/projected/c6a78509-d612-4338-8562-9b0627c1793f-kube-api-access-5gp29\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690941 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-var-lib-cni-bin\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690957 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " 
pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690977 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8475bb26-8864-4d49-935b-db7d4cb73387-mcd-auth-proxy-config\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.690994 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-system-cni-dir\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691011 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-cnibin\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691026 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-multus-socket-dir-parent\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691051 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-run-multus-certs\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: 
I1204 10:15:24.691069 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-var-lib-kubelet\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691085 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-os-release\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691104 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-multus-cni-dir\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691121 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6a78509-d612-4338-8562-9b0627c1793f-cni-binary-copy\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691136 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-run-k8s-cni-cncf-io\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691131 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-run-multus-certs\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691153 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-multus-conf-dir\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691169 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691184 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-var-lib-cni-bin\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691195 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-cnibin\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691196 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-var-lib-cni-multus\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691226 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-var-lib-cni-multus\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691243 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-cnibin\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691239 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-os-release\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691243 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-multus-conf-dir\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691133 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-multus-socket-dir-parent\") pod \"multus-5g27v\" (UID: 
\"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691270 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-run-k8s-cni-cncf-io\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691338 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-host-var-lib-kubelet\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691334 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-cnibin\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691342 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-os-release\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691421 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-system-cni-dir\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc 
kubenswrapper[4831]: I1204 10:15:24.691443 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691510 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6a78509-d612-4338-8562-9b0627c1793f-multus-cni-dir\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691714 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8475bb26-8864-4d49-935b-db7d4cb73387-mcd-auth-proxy-config\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691825 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6a78509-d612-4338-8562-9b0627c1793f-multus-daemon-config\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.691836 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6a78509-d612-4338-8562-9b0627c1793f-cni-binary-copy\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.692276 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.692323 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.694250 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.707900 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.718987 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.730221 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.742759 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.783033 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8475bb26-8864-4d49-935b-db7d4cb73387-proxy-tls\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.783053 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt7gj\" (UniqueName: \"kubernetes.io/projected/8475bb26-8864-4d49-935b-db7d4cb73387-kube-api-access-mt7gj\") pod \"machine-config-daemon-g76nn\" (UID: \"8475bb26-8864-4d49-935b-db7d4cb73387\") " pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.783101 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5gp29\" (UniqueName: \"kubernetes.io/projected/c6a78509-d612-4338-8562-9b0627c1793f-kube-api-access-5gp29\") pod \"multus-5g27v\" (UID: \"c6a78509-d612-4338-8562-9b0627c1793f\") " pod="openshift-multus/multus-5g27v" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.783757 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5j6t\" (UniqueName: \"kubernetes.io/projected/f4f6422d-d5d2-4e56-8f87-84846b4b98eb-kube-api-access-j5j6t\") pod \"multus-additional-cni-plugins-zk2rt\" (UID: \"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\") " pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.834597 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xzkp"] Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.835537 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.837249 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.837749 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.838180 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.838500 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.838507 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.838910 
4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.840211 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.852653 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.865488 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.878262 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.890204 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.892628 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-netd\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.892736 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-var-lib-openvswitch\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.892773 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-etc-openvswitch\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.892811 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.892845 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-bin\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.892883 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-openvswitch\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.892918 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-systemd-units\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.892949 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tb7xl\" (UniqueName: \"kubernetes.io/projected/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-kube-api-access-tb7xl\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893024 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-node-log\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893105 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-kubelet\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893149 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-slash\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893184 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-config\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893203 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-script-lib\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893238 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-systemd\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893255 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-ovn\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893273 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-log-socket\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893289 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovn-node-metrics-cert\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893311 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-netns\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893328 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-env-overrides\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.893361 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.908789 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.921019 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.936861 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.948145 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.960453 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.978906 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.990544 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:24Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994036 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-netd\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994114 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994145 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-var-lib-openvswitch\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994180 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-etc-openvswitch\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994192 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-netd\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994208 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-bin\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994225 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994247 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-var-lib-openvswitch\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994237 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-openvswitch\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994285 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-openvswitch\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994297 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-systemd-units\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994314 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-bin\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994320 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb7xl\" (UniqueName: 
\"kubernetes.io/projected/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-kube-api-access-tb7xl\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994291 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-etc-openvswitch\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994374 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-node-log\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994417 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-systemd-units\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994453 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-kubelet\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994425 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-kubelet\") pod \"ovnkube-node-4xzkp\" (UID: 
\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994495 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-slash\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994501 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-node-log\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994520 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-config\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994566 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-slash\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994601 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-script-lib\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc 
kubenswrapper[4831]: I1204 10:15:24.994635 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-systemd\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994656 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-ovn\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994709 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-log-socket\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-systemd\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994735 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovn-node-metrics-cert\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994743 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-ovn\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994769 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-log-socket\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994816 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-netns\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994841 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-env-overrides\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994878 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994881 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-netns\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.994959 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.995388 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-config\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.995436 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-env-overrides\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.995475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-script-lib\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:24 crc kubenswrapper[4831]: I1204 10:15:24.998298 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovn-node-metrics-cert\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.008518 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb7xl\" (UniqueName: \"kubernetes.io/projected/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-kube-api-access-tb7xl\") pod \"ovnkube-node-4xzkp\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.015773 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.062387 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.071469 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5g27v" Dec 04 10:15:25 crc kubenswrapper[4831]: W1204 10:15:25.072465 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f6422d_d5d2_4e56_8f87_84846b4b98eb.slice/crio-ff9f764a5f29c210a48db5e9fcc820741acedb7d955afbab8ba17266cd322e99 WatchSource:0}: Error finding container ff9f764a5f29c210a48db5e9fcc820741acedb7d955afbab8ba17266cd322e99: Status 404 returned error can't find the container with id ff9f764a5f29c210a48db5e9fcc820741acedb7d955afbab8ba17266cd322e99 Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.079220 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.149036 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:25 crc kubenswrapper[4831]: W1204 10:15:25.162529 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1261b9db_fe52_4fbc_9a9c_7e0c3486276e.slice/crio-e9eea486c4c1a5a5149fe823358e24fb2e8b0d0101dcbffe45d9305f5b002602 WatchSource:0}: Error finding container e9eea486c4c1a5a5149fe823358e24fb2e8b0d0101dcbffe45d9305f5b002602: Status 404 returned error can't find the container with id e9eea486c4c1a5a5149fe823358e24fb2e8b0d0101dcbffe45d9305f5b002602 Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.431824 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556"} Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.433576 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f" exitCode=0 Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.433631 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f"} Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.433649 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"e9eea486c4c1a5a5149fe823358e24fb2e8b0d0101dcbffe45d9305f5b002602"} Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.435248 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070"} Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.435275 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"5399a139e421589fc50e4bad256402be123e37600f5c39eadf919ce767e9eeeb"} Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.435284 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"786e0722bed5f6b797f458f9f56face2161469886563586fee0fec20273e0214"} Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.436937 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4f6422d-d5d2-4e56-8f87-84846b4b98eb" containerID="7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4" exitCode=0 Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.436992 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" event={"ID":"f4f6422d-d5d2-4e56-8f87-84846b4b98eb","Type":"ContainerDied","Data":"7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4"} Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.437009 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" event={"ID":"f4f6422d-d5d2-4e56-8f87-84846b4b98eb","Type":"ContainerStarted","Data":"ff9f764a5f29c210a48db5e9fcc820741acedb7d955afbab8ba17266cd322e99"} Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.438137 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5g27v" 
event={"ID":"c6a78509-d612-4338-8562-9b0627c1793f","Type":"ContainerStarted","Data":"8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df"} Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.438161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5g27v" event={"ID":"c6a78509-d612-4338-8562-9b0627c1793f","Type":"ContainerStarted","Data":"bba63aaf9626fa04857c62668743096bad76b0c77eb52423ba7a7bbd3164909c"} Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.456515 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.478540 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.509000 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.527320 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.541039 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.553862 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.568282 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.583800 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.599377 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.615524 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.628858 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.638103 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.648644 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.658261 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.666321 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.681858 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.692183 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.703538 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.725058 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.736385 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.745404 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.754965 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.767259 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.779737 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:25Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:25 crc kubenswrapper[4831]: I1204 10:15:25.905416 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:25 crc kubenswrapper[4831]: E1204 10:15:25.905539 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:15:29.905519905 +0000 UTC m=+26.854695229 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.006534 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.006578 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.006629 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.006674 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.006718 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.006779 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:30.006765643 +0000 UTC m=+26.955940957 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.006777 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.006777 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.006919 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.006940 4831 projected.go:194] Error preparing data for 
projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.006792 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.006975 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.006989 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.006864 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:30.006842015 +0000 UTC m=+26.956017409 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.007044 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:30.00703286 +0000 UTC m=+26.956208284 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.007060 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:30.007051891 +0000 UTC m=+26.956227325 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.104989 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.111214 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.111281 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.111292 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.111421 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.119027 4831 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.119192 4831 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.120161 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.120208 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.120217 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.120232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.120244 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.140767 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.147835 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.147865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.147873 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.147887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.147895 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.168055 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.174422 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.174466 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.174477 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.174495 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.174506 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.197979 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.207456 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.207748 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.207827 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.207927 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.208004 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.225867 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.229048 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.229080 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.229091 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.229106 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.229116 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.240541 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.240693 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.242447 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.242471 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.242479 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.242494 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.242502 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.275910 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.275965 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.275910 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.276042 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.276101 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:26 crc kubenswrapper[4831]: E1204 10:15:26.276233 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.294421 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9ft9l"] Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.294848 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9ft9l" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.296467 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.296525 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.296998 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.297253 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.312208 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.324710 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.335950 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.344984 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.345024 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.345036 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.345053 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.345066 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.348235 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.365246 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.376341 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.386474 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.400506 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.410838 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l7pt\" (UniqueName: 
\"kubernetes.io/projected/f3938322-cab2-412a-91e4-904ce2d99adf-kube-api-access-4l7pt\") pod \"node-ca-9ft9l\" (UID: \"f3938322-cab2-412a-91e4-904ce2d99adf\") " pod="openshift-image-registry/node-ca-9ft9l" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.410903 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f3938322-cab2-412a-91e4-904ce2d99adf-serviceca\") pod \"node-ca-9ft9l\" (UID: \"f3938322-cab2-412a-91e4-904ce2d99adf\") " pod="openshift-image-registry/node-ca-9ft9l" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.410970 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3938322-cab2-412a-91e4-904ce2d99adf-host\") pod \"node-ca-9ft9l\" (UID: \"f3938322-cab2-412a-91e4-904ce2d99adf\") " pod="openshift-image-registry/node-ca-9ft9l" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.412701 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.424775 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.445151 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.445933 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.445974 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.445986 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.445998 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140"} Dec 04 10:15:26 crc 
kubenswrapper[4831]: I1204 10:15:26.446008 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.446018 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.447788 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.447811 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.447818 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.447831 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.447839 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.449993 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4f6422d-d5d2-4e56-8f87-84846b4b98eb" containerID="aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb" exitCode=0 Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.450453 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" event={"ID":"f4f6422d-d5d2-4e56-8f87-84846b4b98eb","Type":"ContainerDied","Data":"aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.458804 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.472407 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.485400 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.500616 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.512236 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f3938322-cab2-412a-91e4-904ce2d99adf-host\") pod \"node-ca-9ft9l\" (UID: \"f3938322-cab2-412a-91e4-904ce2d99adf\") " pod="openshift-image-registry/node-ca-9ft9l" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.512476 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l7pt\" (UniqueName: \"kubernetes.io/projected/f3938322-cab2-412a-91e4-904ce2d99adf-kube-api-access-4l7pt\") pod \"node-ca-9ft9l\" (UID: \"f3938322-cab2-412a-91e4-904ce2d99adf\") " pod="openshift-image-registry/node-ca-9ft9l" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.512566 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f3938322-cab2-412a-91e4-904ce2d99adf-serviceca\") pod \"node-ca-9ft9l\" (UID: \"f3938322-cab2-412a-91e4-904ce2d99adf\") " pod="openshift-image-registry/node-ca-9ft9l" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.512383 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3938322-cab2-412a-91e4-904ce2d99adf-host\") pod \"node-ca-9ft9l\" (UID: \"f3938322-cab2-412a-91e4-904ce2d99adf\") " pod="openshift-image-registry/node-ca-9ft9l" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.514242 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f3938322-cab2-412a-91e4-904ce2d99adf-serviceca\") pod \"node-ca-9ft9l\" (UID: \"f3938322-cab2-412a-91e4-904ce2d99adf\") " pod="openshift-image-registry/node-ca-9ft9l" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.514782 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.525727 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.533984 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l7pt\" (UniqueName: \"kubernetes.io/projected/f3938322-cab2-412a-91e4-904ce2d99adf-kube-api-access-4l7pt\") pod \"node-ca-9ft9l\" (UID: \"f3938322-cab2-412a-91e4-904ce2d99adf\") " pod="openshift-image-registry/node-ca-9ft9l" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.538694 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.550141 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.550783 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.550827 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.550839 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.550857 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.550869 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.558830 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.569471 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.579732 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.591858 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.603790 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.620707 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.631111 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.652711 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.652755 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.652767 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 
10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.652787 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.652799 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.664324 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9ft9l" Dec 04 10:15:26 crc kubenswrapper[4831]: W1204 10:15:26.720157 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3938322_cab2_412a_91e4_904ce2d99adf.slice/crio-2c19df994c5a957a2eb0f43fa8b9365e7695f9bd12cd04c4cd41410e10e0f933 WatchSource:0}: Error finding container 2c19df994c5a957a2eb0f43fa8b9365e7695f9bd12cd04c4cd41410e10e0f933: Status 404 returned error can't find the container with id 2c19df994c5a957a2eb0f43fa8b9365e7695f9bd12cd04c4cd41410e10e0f933 Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.755418 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.755458 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.755467 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.755483 4831 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.755492 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.858468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.858518 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.858527 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.858566 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.858595 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.961112 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.961149 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.961157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.961169 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.961178 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:26Z","lastTransitionTime":"2025-12-04T10:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:26 crc kubenswrapper[4831]: I1204 10:15:26.999007 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.002185 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.008867 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.015853 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.037311 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.049051 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.059785 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.062975 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.063008 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.063016 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.063031 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.063040 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:27Z","lastTransitionTime":"2025-12-04T10:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.071401 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.081246 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.090474 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.100075 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.109930 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.120910 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.131058 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.147435 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.158825 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.165224 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.165273 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.165295 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 
10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.165315 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.165329 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:27Z","lastTransitionTime":"2025-12-04T10:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.170895 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operat
or\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.181610 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.192411 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.204485 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.218469 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.227716 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.237971 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.248641 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.259764 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.267865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.267910 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.267921 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.267941 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.267988 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:27Z","lastTransitionTime":"2025-12-04T10:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.276903 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.322805 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.355841 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.369557 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.369591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.369599 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 
10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.369613 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.369623 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:27Z","lastTransitionTime":"2025-12-04T10:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.404242 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.442777 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.455340 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4f6422d-d5d2-4e56-8f87-84846b4b98eb" 
containerID="455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea" exitCode=0 Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.455402 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" event={"ID":"f4f6422d-d5d2-4e56-8f87-84846b4b98eb","Type":"ContainerDied","Data":"455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.458126 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9ft9l" event={"ID":"f3938322-cab2-412a-91e4-904ce2d99adf","Type":"ContainerStarted","Data":"a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.458194 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9ft9l" event={"ID":"f3938322-cab2-412a-91e4-904ce2d99adf","Type":"ContainerStarted","Data":"2c19df994c5a957a2eb0f43fa8b9365e7695f9bd12cd04c4cd41410e10e0f933"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.471633 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.471691 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.471703 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.471721 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.471731 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:27Z","lastTransitionTime":"2025-12-04T10:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:27 crc kubenswrapper[4831]: E1204 10:15:27.477585 4831 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.504888 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10
:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.538554 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.574125 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.574149 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.574157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.574170 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.574178 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:27Z","lastTransitionTime":"2025-12-04T10:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.584230 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.618040 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.657800 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.676863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.676895 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.676903 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.676916 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.676925 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:27Z","lastTransitionTime":"2025-12-04T10:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.696977 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.741392 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.775595 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.779710 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.779754 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.779767 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 
10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.779783 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.779794 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:27Z","lastTransitionTime":"2025-12-04T10:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.819685 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.861450 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b8
45ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.882703 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.882747 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.882758 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.882774 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.882785 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:27Z","lastTransitionTime":"2025-12-04T10:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.897915 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.942293 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.985641 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.985715 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.985728 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.985745 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.985760 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:27Z","lastTransitionTime":"2025-12-04T10:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:27 crc kubenswrapper[4831]: I1204 10:15:27.988703 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.022752 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.065110 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.088749 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.088795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.088806 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.088823 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.088834 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:28Z","lastTransitionTime":"2025-12-04T10:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.106695 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.143062 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.179044 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.191250 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.191401 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.191487 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 
10:15:28.191609 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.191748 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:28Z","lastTransitionTime":"2025-12-04T10:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.226392 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.262186 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b8
45ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.275851 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.275862 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:28 crc kubenswrapper[4831]: E1204 10:15:28.276014 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.276034 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:28 crc kubenswrapper[4831]: E1204 10:15:28.276115 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:28 crc kubenswrapper[4831]: E1204 10:15:28.276265 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.294083 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.294151 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.294171 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.294197 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.294214 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:28Z","lastTransitionTime":"2025-12-04T10:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.304998 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.338606 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.384814 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.396901 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.396942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.396954 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:28 crc 
kubenswrapper[4831]: I1204 10:15:28.396969 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.396980 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:28Z","lastTransitionTime":"2025-12-04T10:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.417929 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 
10:15:28.463046 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.463941 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4f6422d-d5d2-4e56-8f87-84846b4b98eb" containerID="63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38" exitCode=0 Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.464025 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" event={"ID":"f4f6422d-d5d2-4e56-8f87-84846b4b98eb","Type":"ContainerDied","Data":"63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38"} Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.499765 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.499825 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.499849 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.499880 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.499903 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:28Z","lastTransitionTime":"2025-12-04T10:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.502699 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.536759 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.577877 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.601828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.601869 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.601881 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:28 crc 
kubenswrapper[4831]: I1204 10:15:28.601897 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.601907 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:28Z","lastTransitionTime":"2025-12-04T10:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.618768 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.659037 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b8
45ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.697055 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.705061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.705101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.705114 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:28 crc 
kubenswrapper[4831]: I1204 10:15:28.705129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.705143 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:28Z","lastTransitionTime":"2025-12-04T10:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.739267 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 
10:15:28.788514 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.809503 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.809542 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.809553 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.809580 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.809591 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:28Z","lastTransitionTime":"2025-12-04T10:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.822397 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.860530 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.901380 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.912093 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.912135 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.912148 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.912166 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.912178 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:28Z","lastTransitionTime":"2025-12-04T10:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.942013 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa
6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e37600f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:28 crc kubenswrapper[4831]: I1204 10:15:28.982618 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs
\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:28Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.014129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.014196 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.014217 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.014244 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.014260 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:29Z","lastTransitionTime":"2025-12-04T10:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.028965 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159c
ac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.073527 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.102048 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.116307 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.116500 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.116527 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.116555 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.116576 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:29Z","lastTransitionTime":"2025-12-04T10:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.142471 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.218923 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.218995 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.219016 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.219044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.219065 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:29Z","lastTransitionTime":"2025-12-04T10:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.321953 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.322002 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.322017 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.322038 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.322054 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:29Z","lastTransitionTime":"2025-12-04T10:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.424477 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.424547 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.424576 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.424604 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.424623 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:29Z","lastTransitionTime":"2025-12-04T10:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.472483 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.476221 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4f6422d-d5d2-4e56-8f87-84846b4b98eb" containerID="ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2" exitCode=0 Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.476260 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" event={"ID":"f4f6422d-d5d2-4e56-8f87-84846b4b98eb","Type":"ContainerDied","Data":"ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.495849 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.516254 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.528074 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.528136 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.528154 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.528178 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.528195 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:29Z","lastTransitionTime":"2025-12-04T10:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.539336 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.564890 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.581764 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.599639 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b8
45ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.612137 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.627051 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.630835 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.630878 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.630890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.630908 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.630922 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:29Z","lastTransitionTime":"2025-12-04T10:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.639110 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.651702 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.665786 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.680084 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.688413 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.704724 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:29Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.733244 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.733281 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.733292 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.733307 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.733316 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:29Z","lastTransitionTime":"2025-12-04T10:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.836613 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.836690 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.836706 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.836725 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.836738 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:29Z","lastTransitionTime":"2025-12-04T10:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.938898 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.938942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.938954 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.938975 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.938992 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:29Z","lastTransitionTime":"2025-12-04T10:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:29 crc kubenswrapper[4831]: I1204 10:15:29.945611 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:29 crc kubenswrapper[4831]: E1204 10:15:29.945829 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 10:15:37.945796258 +0000 UTC m=+34.894971622 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.042933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.043014 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.043035 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.043077 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.043099 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:30Z","lastTransitionTime":"2025-12-04T10:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.046567 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.046640 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.046708 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.046751 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.046881 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.046889 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.046933 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.046978 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.046990 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:38.046966394 +0000 UTC m=+34.996141748 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.047004 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.047079 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:38.047053056 +0000 UTC m=+34.996228410 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.046886 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.047153 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:38.047134188 +0000 UTC m=+34.996309602 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.046904 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.047177 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.047198 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:38.04719239 +0000 UTC m=+34.996367704 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.145450 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.145487 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.145499 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.145513 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.145522 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:30Z","lastTransitionTime":"2025-12-04T10:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.248431 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.248505 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.248527 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.248552 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.248572 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:30Z","lastTransitionTime":"2025-12-04T10:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.275923 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.276160 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.276226 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.276280 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.276398 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:30 crc kubenswrapper[4831]: E1204 10:15:30.276508 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.351019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.351097 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.351120 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.351155 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.351177 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:30Z","lastTransitionTime":"2025-12-04T10:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.454324 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.454395 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.454419 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.454448 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.454470 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:30Z","lastTransitionTime":"2025-12-04T10:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.483525 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4f6422d-d5d2-4e56-8f87-84846b4b98eb" containerID="13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c" exitCode=0 Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.483573 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" event={"ID":"f4f6422d-d5d2-4e56-8f87-84846b4b98eb","Type":"ContainerDied","Data":"13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c"} Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.499489 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.515743 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.525986 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.543832 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.556955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.557012 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.557023 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.557046 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.557058 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:30Z","lastTransitionTime":"2025-12-04T10:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.558039 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.571926 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.589257 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.604245 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.622307 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.640725 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.659026 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.660746 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.660947 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.661094 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.661116 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.661128 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:30Z","lastTransitionTime":"2025-12-04T10:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.674606 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.696518 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.708496 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:30Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.763465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.763501 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.763509 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 
10:15:30.763524 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.763533 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:30Z","lastTransitionTime":"2025-12-04T10:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.866272 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.866315 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.866324 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.866340 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.866351 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:30Z","lastTransitionTime":"2025-12-04T10:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.969422 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.969455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.969465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.969485 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:30 crc kubenswrapper[4831]: I1204 10:15:30.969496 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:30Z","lastTransitionTime":"2025-12-04T10:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.071306 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.071345 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.071355 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.071370 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.071381 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:31Z","lastTransitionTime":"2025-12-04T10:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.174983 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.175065 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.175088 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.175125 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.175148 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:31Z","lastTransitionTime":"2025-12-04T10:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.277802 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.277871 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.277888 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.277916 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.277934 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:31Z","lastTransitionTime":"2025-12-04T10:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.380474 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.380512 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.380521 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.380537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.380547 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:31Z","lastTransitionTime":"2025-12-04T10:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.484174 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.484235 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.484257 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.484286 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.484306 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:31Z","lastTransitionTime":"2025-12-04T10:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.492774 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.493193 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.493258 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.501996 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" event={"ID":"f4f6422d-d5d2-4e56-8f87-84846b4b98eb","Type":"ContainerStarted","Data":"c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.516477 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.539015 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.562740 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.579160 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.587909 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.587964 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.587982 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.588005 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.588022 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:31Z","lastTransitionTime":"2025-12-04T10:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.601329 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.615794 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.634749 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.648383 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.667728 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.685707 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.691085 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:31 crc 
kubenswrapper[4831]: I1204 10:15:31.691129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.691145 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.691165 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.691179 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:31Z","lastTransitionTime":"2025-12-04T10:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.699656 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.715053 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.737489 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.757719 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.781380 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.794751 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.794846 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.794865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.794894 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.794918 4831 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:31Z","lastTransitionTime":"2025-12-04T10:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.799470 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.817471 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.828504 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.841936 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.855499 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.867830 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637
d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.878963 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.889763 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.896836 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.896878 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.896894 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.896913 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.896925 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:31Z","lastTransitionTime":"2025-12-04T10:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.904266 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.917683 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.929298 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.940094 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.950879 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.960747 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:31Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.999798 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.999845 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.999859 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.999876 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:31 crc kubenswrapper[4831]: I1204 10:15:31.999888 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:31Z","lastTransitionTime":"2025-12-04T10:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.102764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.102804 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.102813 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.102828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.102838 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:32Z","lastTransitionTime":"2025-12-04T10:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.206087 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.206150 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.206170 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.206199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.206220 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:32Z","lastTransitionTime":"2025-12-04T10:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.276364 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.276498 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:32 crc kubenswrapper[4831]: E1204 10:15:32.276623 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.276389 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:32 crc kubenswrapper[4831]: E1204 10:15:32.276779 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:32 crc kubenswrapper[4831]: E1204 10:15:32.277065 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.309719 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.310162 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.310230 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.310298 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.310380 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:32Z","lastTransitionTime":"2025-12-04T10:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.413652 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.413990 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.414082 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.414176 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.414252 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:32Z","lastTransitionTime":"2025-12-04T10:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.505682 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.516518 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.516557 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.516569 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.516585 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.516598 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:32Z","lastTransitionTime":"2025-12-04T10:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.527889 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.541549 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2da
ed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"m
ountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.556132 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.568636 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.581936 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.600234 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.613360 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.618404 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.618465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.618482 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.618505 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.618523 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:32Z","lastTransitionTime":"2025-12-04T10:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.632259 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.647296 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.659540 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.671813 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.688358 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.698415 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.709522 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.720533 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:32Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.721182 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:32 crc 
kubenswrapper[4831]: I1204 10:15:32.721228 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.721241 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.721260 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.721272 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:32Z","lastTransitionTime":"2025-12-04T10:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.823513 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.823549 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.823560 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.823575 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.823586 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:32Z","lastTransitionTime":"2025-12-04T10:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.927125 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.927169 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.927180 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.927195 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:32 crc kubenswrapper[4831]: I1204 10:15:32.927205 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:32Z","lastTransitionTime":"2025-12-04T10:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.029264 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.029310 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.029326 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.029345 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.029362 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:33Z","lastTransitionTime":"2025-12-04T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.132157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.132193 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.132205 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.132223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.132234 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:33Z","lastTransitionTime":"2025-12-04T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.234451 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.234488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.234499 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.234514 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.234525 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:33Z","lastTransitionTime":"2025-12-04T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.290079 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.301219 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.311769 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.325438 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.337024 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.337084 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.337100 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.337134 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.337152 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:33Z","lastTransitionTime":"2025-12-04T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.342725 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.352321 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.364592 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.374559 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.386800 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd7
8e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:
15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.400379 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.413563 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.427104 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.439513 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.439576 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.439593 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.439614 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.439630 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:33Z","lastTransitionTime":"2025-12-04T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.441744 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.457841 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.542960 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.543022 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.543038 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.543064 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.543082 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:33Z","lastTransitionTime":"2025-12-04T10:15:33Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.646687 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.646749 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.646768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.646793 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.646810 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:33Z","lastTransitionTime":"2025-12-04T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.749640 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.749729 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.749783 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.749818 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.749840 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:33Z","lastTransitionTime":"2025-12-04T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.852414 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.852466 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.852483 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.852506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.852524 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:33Z","lastTransitionTime":"2025-12-04T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.955493 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.955623 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.955708 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.955734 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:33 crc kubenswrapper[4831]: I1204 10:15:33.955761 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:33Z","lastTransitionTime":"2025-12-04T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.059069 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.059148 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.059170 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.059199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.059221 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:34Z","lastTransitionTime":"2025-12-04T10:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.161980 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.162059 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.162085 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.162112 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.162129 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:34Z","lastTransitionTime":"2025-12-04T10:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.265274 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.265340 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.265358 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.265383 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.265400 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:34Z","lastTransitionTime":"2025-12-04T10:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.275540 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.275606 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:34 crc kubenswrapper[4831]: E1204 10:15:34.275741 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.275786 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:34 crc kubenswrapper[4831]: E1204 10:15:34.275944 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:34 crc kubenswrapper[4831]: E1204 10:15:34.276121 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.367760 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.367825 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.367842 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.367866 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.367883 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:34Z","lastTransitionTime":"2025-12-04T10:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.471537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.471588 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.471603 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.471626 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.471643 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:34Z","lastTransitionTime":"2025-12-04T10:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.514620 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/0.log" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.519238 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7" exitCode=1 Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.519287 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7"} Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.520839 4831 scope.go:117] "RemoveContainer" containerID="5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.545759 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.565446 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.575150 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.575232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.575259 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.575293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.575316 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:34Z","lastTransitionTime":"2025-12-04T10:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.589485 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.612082 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.630758 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.647026 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.675823 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:33Z\\\",\\\"message\\\":\\\"1204 10:15:33.248372 6134 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 10:15:33.248391 6134 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1204 10:15:33.248395 6134 
handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1204 10:15:33.248416 6134 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 10:15:33.248451 6134 handler.go:208] Removed *v1.Node event handler 7\\\\nI1204 10:15:33.248459 6134 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 10:15:33.248455 6134 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 10:15:33.248461 6134 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 10:15:33.248435 6134 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 10:15:33.248483 6134 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 10:15:33.248516 6134 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 10:15:33.248543 6134 factory.go:656] Stopping watch factory\\\\nI1204 10:15:33.248558 6134 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:33.248565 6134 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 10:15:33.248585 6134 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 10:15:33.248619 6134 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5
b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.678868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.678927 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.678950 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.678981 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.679056 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:34Z","lastTransitionTime":"2025-12-04T10:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.693337 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.713090 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.730236 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\
\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.749116 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.769039 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.782402 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.782449 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.782463 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.782484 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.782500 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:34Z","lastTransitionTime":"2025-12-04T10:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.784792 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.803019 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:34Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.885125 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.885194 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.885211 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.885238 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.885253 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:34Z","lastTransitionTime":"2025-12-04T10:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.987067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.987468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.987483 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.987506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:34 crc kubenswrapper[4831]: I1204 10:15:34.987519 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:34Z","lastTransitionTime":"2025-12-04T10:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.090516 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.090607 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.090621 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.090636 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.090649 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:35Z","lastTransitionTime":"2025-12-04T10:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.193000 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.193034 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.193043 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.193056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.193065 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:35Z","lastTransitionTime":"2025-12-04T10:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.295318 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.295356 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.295367 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.295397 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.295408 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:35Z","lastTransitionTime":"2025-12-04T10:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.397714 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.397763 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.397776 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.397795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.397807 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:35Z","lastTransitionTime":"2025-12-04T10:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.499713 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.499769 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.499785 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.499807 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.499825 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:35Z","lastTransitionTime":"2025-12-04T10:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.524765 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/0.log" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.527516 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7"} Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.528090 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.544105 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.558203 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.571920 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.587565 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.603092 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.603166 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.603190 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.603222 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.603246 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:35Z","lastTransitionTime":"2025-12-04T10:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.606421 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.624739 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.638635 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.652610 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.698520 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:33Z\\\",\\\"message\\\":\\\"1204 10:15:33.248372 6134 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 10:15:33.248391 6134 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1204 10:15:33.248395 6134 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1204 10:15:33.248416 6134 handler.go:208] Removed *v1.Pod event 
handler 3\\\\nI1204 10:15:33.248451 6134 handler.go:208] Removed *v1.Node event handler 7\\\\nI1204 10:15:33.248459 6134 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 10:15:33.248455 6134 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 10:15:33.248461 6134 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 10:15:33.248435 6134 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 10:15:33.248483 6134 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 10:15:33.248516 6134 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 10:15:33.248543 6134 factory.go:656] Stopping watch factory\\\\nI1204 10:15:33.248558 6134 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:33.248565 6134 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 10:15:33.248585 6134 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 10:15:33.248619 6134 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.705325 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.705382 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.705390 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.705405 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.705414 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:35Z","lastTransitionTime":"2025-12-04T10:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.714445 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.733713 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.775916 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.792655 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd7
8e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:
15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.807283 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.807268 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:35Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.807508 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.807525 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.807547 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.807563 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:35Z","lastTransitionTime":"2025-12-04T10:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.912876 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.912992 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.913030 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.913064 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:35 crc kubenswrapper[4831]: I1204 10:15:35.913086 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:35Z","lastTransitionTime":"2025-12-04T10:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.016220 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.016301 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.016324 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.016355 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.016377 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.119532 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.119610 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.119630 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.119653 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.119725 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.221790 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.221831 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.221842 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.221858 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.221871 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.275757 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.275786 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:36 crc kubenswrapper[4831]: E1204 10:15:36.275878 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.275757 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:36 crc kubenswrapper[4831]: E1204 10:15:36.276010 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:36 crc kubenswrapper[4831]: E1204 10:15:36.276120 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.324548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.324628 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.324648 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.324725 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.324743 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.427335 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.427401 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.427423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.427452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.427474 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.445772 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.445874 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.445942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.445973 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.446043 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: E1204 10:15:36.467989 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.474518 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.474582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.474602 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.474627 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.474646 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.492956 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t"] Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.493957 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: E1204 10:15:36.494975 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.497587 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.498761 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.505198 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.505248 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.505271 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.505296 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.505313 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.524555 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: E1204 10:15:36.528425 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.533751 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.533784 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.533797 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.533812 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.533824 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.535128 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/1.log" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.536302 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/0.log" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.540406 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7" exitCode=1 Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.540513 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.540569 4831 scope.go:117] "RemoveContainer" containerID="5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.541549 4831 scope.go:117] "RemoveContainer" containerID="d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7" Dec 04 10:15:36 crc kubenswrapper[4831]: E1204 10:15:36.541880 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 
10s restarting failed container=ovnkube-controller pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.547106 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: E1204 10:15:36.555607 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"
sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":48599861
6},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\
\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.559858 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.559895 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.559906 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.559921 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.559931 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.570610 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:33Z\\\",\\\"message\\\":\\\"1204 10:15:33.248372 6134 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 10:15:33.248391 6134 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1204 10:15:33.248395 6134 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1204 10:15:33.248416 6134 handler.go:208] Removed *v1.Pod event 
handler 3\\\\nI1204 10:15:33.248451 6134 handler.go:208] Removed *v1.Node event handler 7\\\\nI1204 10:15:33.248459 6134 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 10:15:33.248455 6134 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 10:15:33.248461 6134 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 10:15:33.248435 6134 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 10:15:33.248483 6134 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 10:15:33.248516 6134 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 10:15:33.248543 6134 factory.go:656] Stopping watch factory\\\\nI1204 10:15:33.248558 6134 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:33.248565 6134 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 10:15:33.248585 6134 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 10:15:33.248619 6134 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: E1204 10:15:36.573130 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: E1204 10:15:36.573283 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.574700 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.574738 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.574749 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.574765 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.574777 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.581771 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.594922 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.609259 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\
\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.612673 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca0a26ee-8553-41fc-8723-935bd994e3dd-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" (UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.612704 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmwqm\" (UniqueName: \"kubernetes.io/projected/ca0a26ee-8553-41fc-8723-935bd994e3dd-kube-api-access-cmwqm\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" 
(UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.612733 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca0a26ee-8553-41fc-8723-935bd994e3dd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" (UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.612750 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca0a26ee-8553-41fc-8723-935bd994e3dd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" (UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.619557 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.633817 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.647482 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.661758 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.676591 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.676978 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.677020 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.677039 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.677057 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.677070 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.690591 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b097
15bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.702106 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.713645 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca0a26ee-8553-41fc-8723-935bd994e3dd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" (UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.713697 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca0a26ee-8553-41fc-8723-935bd994e3dd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" (UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.713759 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca0a26ee-8553-41fc-8723-935bd994e3dd-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" (UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.714043 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmwqm\" (UniqueName: 
\"kubernetes.io/projected/ca0a26ee-8553-41fc-8723-935bd994e3dd-kube-api-access-cmwqm\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" (UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.714393 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca0a26ee-8553-41fc-8723-935bd994e3dd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" (UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.714458 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca0a26ee-8553-41fc-8723-935bd994e3dd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" (UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.718862 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.720412 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/ca0a26ee-8553-41fc-8723-935bd994e3dd-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" (UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.730058 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmwqm\" (UniqueName: \"kubernetes.io/projected/ca0a26ee-8553-41fc-8723-935bd994e3dd-kube-api-access-cmwqm\") pod \"ovnkube-control-plane-749d76644c-ttr4t\" (UID: \"ca0a26ee-8553-41fc-8723-935bd994e3dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.733043 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e37600f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.750590 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.764135 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.780287 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.780329 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.780343 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.780362 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.780376 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.782101 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.795556 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.810038 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.819182 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.825243 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: W1204 10:15:36.833509 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca0a26ee_8553_41fc_8723_935bd994e3dd.slice/crio-249ba18714c2bb598d2611fe65f32ce465162006bfd401d5640c3d229cc50d6d WatchSource:0}: Error finding container 249ba18714c2bb598d2611fe65f32ce465162006bfd401d5640c3d229cc50d6d: Status 404 returned error can't find the container with id 249ba18714c2bb598d2611fe65f32ce465162006bfd401d5640c3d229cc50d6d Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.855615 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d6e5616fcaf6e34e6247d6e0cf66677c59142310c093aa0f4e449fffffdd3b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:33Z\\\",\\\"message\\\":\\\"1204 10:15:33.248372 6134 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 10:15:33.248391 6134 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1204 10:15:33.248395 6134 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1204 10:15:33.248416 6134 handler.go:208] Removed *v1.Pod event 
handler 3\\\\nI1204 10:15:33.248451 6134 handler.go:208] Removed *v1.Node event handler 7\\\\nI1204 10:15:33.248459 6134 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 10:15:33.248455 6134 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 10:15:33.248461 6134 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 10:15:33.248435 6134 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 10:15:33.248483 6134 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 10:15:33.248516 6134 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 10:15:33.248543 6134 factory.go:656] Stopping watch factory\\\\nI1204 10:15:33.248558 6134 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:33.248565 6134 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 10:15:33.248585 6134 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 10:15:33.248619 6134 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"5:35.785914 6259 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 10:15:35.785917 6259 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 10:15:35.785923 6259 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786159 6259 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786390 6259 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786452 6259 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786578 6259 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 10:15:35.786806 6259 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.786944 6259 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.787222 6259 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},
{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.870477 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.883010 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.883039 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.883052 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.883067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.883079 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.892180 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.911701 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\
\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.926065 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.941341 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.955637 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.969631 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.982721 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:36Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.985775 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.985814 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.985823 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.985839 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:36 crc kubenswrapper[4831]: I1204 10:15:36.985857 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:36Z","lastTransitionTime":"2025-12-04T10:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.089507 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.089858 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.090000 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.090134 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.090259 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:37Z","lastTransitionTime":"2025-12-04T10:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.193890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.193949 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.193965 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.193988 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.194004 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:37Z","lastTransitionTime":"2025-12-04T10:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.296537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.297232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.297319 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.297349 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.297409 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:37Z","lastTransitionTime":"2025-12-04T10:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.400862 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.400918 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.400929 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.400956 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.400969 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:37Z","lastTransitionTime":"2025-12-04T10:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.504549 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.504609 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.504627 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.504654 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.504720 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:37Z","lastTransitionTime":"2025-12-04T10:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.546865 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" event={"ID":"ca0a26ee-8553-41fc-8723-935bd994e3dd","Type":"ContainerStarted","Data":"249ba18714c2bb598d2611fe65f32ce465162006bfd401d5640c3d229cc50d6d"} Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.548114 4831 scope.go:117] "RemoveContainer" containerID="d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7" Dec 04 10:15:37 crc kubenswrapper[4831]: E1204 10:15:37.548451 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.587398 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"5:35.785914 6259 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 
10:15:35.785917 6259 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 10:15:35.785923 6259 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786159 6259 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786390 6259 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786452 6259 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786578 6259 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 10:15:35.786806 6259 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.786944 6259 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.787222 6259 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.601936 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.607200 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.607233 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.607242 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.607276 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.607287 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:37Z","lastTransitionTime":"2025-12-04T10:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.620336 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.641318 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.658325 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd7
8e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:
15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.673141 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.689937 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.702501 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.709892 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.709916 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.709925 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:37 crc 
kubenswrapper[4831]: I1204 10:15:37.709937 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.709947 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:37Z","lastTransitionTime":"2025-12-04T10:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.716627 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 
10:15:37.733076 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.747908 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.769812 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.785578 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.803022 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.812052 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.812100 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.812112 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.812132 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.812143 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:37Z","lastTransitionTime":"2025-12-04T10:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.818047 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.915348 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.915392 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.915404 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.915424 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.915437 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:37Z","lastTransitionTime":"2025-12-04T10:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.976731 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-fd6cw"] Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.977590 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:37 crc kubenswrapper[4831]: E1204 10:15:37.977742 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:37 crc kubenswrapper[4831]: I1204 10:15:37.990555 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:37Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.006509 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.017386 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.017435 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.017448 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.017468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.017481 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:38Z","lastTransitionTime":"2025-12-04T10:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.027855 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.027996 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:15:54.027979834 +0000 UTC m=+50.977155148 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.029619 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.045442 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39
aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.061736 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.072445 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.084942 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.095478 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.106086 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc 
kubenswrapper[4831]: I1204 10:15:38.119965 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.120003 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.120012 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.120028 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.120038 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:38Z","lastTransitionTime":"2025-12-04T10:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.120674 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.129042 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.129083 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.129109 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.129140 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.129163 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-strcn\" (UniqueName: \"kubernetes.io/projected/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-kube-api-access-strcn\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.129187 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129244 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129271 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129286 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129285 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129298 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129323 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129243 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129333 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:54.129315855 +0000 UTC m=+51.078491179 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129412 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:54.129390707 +0000 UTC m=+51.078566031 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129334 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129427 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:54.129418958 +0000 UTC m=+51.078594282 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.129476 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:54.129458669 +0000 UTC m=+51.078634053 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.134738 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.152984 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"5:35.785914 6259 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 10:15:35.785917 6259 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 10:15:35.785923 6259 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786159 6259 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786390 6259 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786452 6259 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786578 6259 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 10:15:35.786806 6259 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.786944 6259 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.787222 6259 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.165441 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.183832 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.198976 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\
\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.209953 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.222254 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.222294 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.222304 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.222333 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.222346 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:38Z","lastTransitionTime":"2025-12-04T10:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.229891 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.229943 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-strcn\" (UniqueName: \"kubernetes.io/projected/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-kube-api-access-strcn\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.230148 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.230262 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs podName:5b50ce71-ca0a-4532-86e9-4f779dcc7b93 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:38.730243945 +0000 UTC m=+35.679419259 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs") pod "network-metrics-daemon-fd6cw" (UID: "5b50ce71-ca0a-4532-86e9-4f779dcc7b93") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.245925 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-strcn\" (UniqueName: \"kubernetes.io/projected/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-kube-api-access-strcn\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.275672 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.275784 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.275999 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.276044 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.276250 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.276403 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.325179 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.325210 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.325221 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.325237 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.325249 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:38Z","lastTransitionTime":"2025-12-04T10:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.427111 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.427152 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.427160 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.427174 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.427183 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:38Z","lastTransitionTime":"2025-12-04T10:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.529629 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.529950 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.529972 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.529993 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.530008 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:38Z","lastTransitionTime":"2025-12-04T10:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.551493 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" event={"ID":"ca0a26ee-8553-41fc-8723-935bd994e3dd","Type":"ContainerStarted","Data":"3c389f190e572a728addfeaa6cb3bac8ece2ffa00fd451e584b5bcf4965256fb"} Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.551878 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" event={"ID":"ca0a26ee-8553-41fc-8723-935bd994e3dd","Type":"ContainerStarted","Data":"87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2"} Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.553290 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/1.log" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.566372 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.577341 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.597215 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"5:35.785914 6259 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 10:15:35.785917 6259 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 10:15:35.785923 6259 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786159 6259 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786390 6259 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786452 6259 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786578 6259 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 10:15:35.786806 6259 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.786944 6259 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.787222 6259 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.606007 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.624049 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.631848 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.631882 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.631894 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.631910 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.631922 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:38Z","lastTransitionTime":"2025-12-04T10:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.643614 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.657390 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2ffa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.669836 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.683349 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.698316 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.714696 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.729495 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.733886 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.733920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.733932 4831 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.733947 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.733958 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:38Z","lastTransitionTime":"2025-12-04T10:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.736368 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.736525 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:38 crc kubenswrapper[4831]: E1204 10:15:38.736612 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs podName:5b50ce71-ca0a-4532-86e9-4f779dcc7b93 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:39.736590449 +0000 UTC m=+36.685765773 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs") pod "network-metrics-daemon-fd6cw" (UID: "5b50ce71-ca0a-4532-86e9-4f779dcc7b93") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.741402 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.755265 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.768466 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.780942 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:38Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:38 crc 
kubenswrapper[4831]: I1204 10:15:38.836030 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.836083 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.836099 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.836121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.836137 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:38Z","lastTransitionTime":"2025-12-04T10:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.938824 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.938863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.938874 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.938888 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:38 crc kubenswrapper[4831]: I1204 10:15:38.938898 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:38Z","lastTransitionTime":"2025-12-04T10:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.041424 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.041500 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.041512 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.041528 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.041600 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:39Z","lastTransitionTime":"2025-12-04T10:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.144888 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.144964 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.144989 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.145019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.145039 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:39Z","lastTransitionTime":"2025-12-04T10:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.247408 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.247445 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.247454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.247469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.247480 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:39Z","lastTransitionTime":"2025-12-04T10:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.276404 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:39 crc kubenswrapper[4831]: E1204 10:15:39.276638 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.350529 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.350597 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.350617 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.350644 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.350695 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:39Z","lastTransitionTime":"2025-12-04T10:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.453931 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.454003 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.454028 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.454058 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.454079 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:39Z","lastTransitionTime":"2025-12-04T10:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.556644 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.556806 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.556840 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.556927 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.556952 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:39Z","lastTransitionTime":"2025-12-04T10:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.659584 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.659632 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.659644 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.659690 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.659704 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:39Z","lastTransitionTime":"2025-12-04T10:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.746799 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:39 crc kubenswrapper[4831]: E1204 10:15:39.747019 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:39 crc kubenswrapper[4831]: E1204 10:15:39.747118 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs podName:5b50ce71-ca0a-4532-86e9-4f779dcc7b93 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:41.747087118 +0000 UTC m=+38.696262472 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs") pod "network-metrics-daemon-fd6cw" (UID: "5b50ce71-ca0a-4532-86e9-4f779dcc7b93") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.763087 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.763128 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.763187 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.763211 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.763230 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:39Z","lastTransitionTime":"2025-12-04T10:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.818830 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.836443 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.852717 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.865574 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:39 crc 
kubenswrapper[4831]: I1204 10:15:39.865653 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.865710 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.865742 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.865766 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:39Z","lastTransitionTime":"2025-12-04T10:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.874840 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"5:35.785914 6259 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 
10:15:35.785917 6259 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 10:15:35.785923 6259 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786159 6259 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786390 6259 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786452 6259 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786578 6259 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 10:15:35.786806 6259 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.786944 6259 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.787222 6259 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.887440 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.901956 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T1
0:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.917210 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637
d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.934410 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2ffa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.946967 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.960198 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.968715 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.968749 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.968759 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.968775 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.968787 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:39Z","lastTransitionTime":"2025-12-04T10:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.973927 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:39 crc kubenswrapper[4831]: I1204 10:15:39.986203 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.000451 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:39Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.010391 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:40Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.022284 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:40Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.033735 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:40Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.047016 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:40Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:40 crc 
kubenswrapper[4831]: I1204 10:15:40.071303 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.071349 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.071369 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.071390 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.071405 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:40Z","lastTransitionTime":"2025-12-04T10:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.173956 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.174008 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.174025 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.174044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.174056 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:40Z","lastTransitionTime":"2025-12-04T10:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.275344 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.275443 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:40 crc kubenswrapper[4831]: E1204 10:15:40.275513 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:40 crc kubenswrapper[4831]: E1204 10:15:40.275641 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.275362 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:40 crc kubenswrapper[4831]: E1204 10:15:40.275822 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.277056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.277104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.277122 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.277145 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.277160 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:40Z","lastTransitionTime":"2025-12-04T10:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.379079 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.379109 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.379117 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.379130 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.379139 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:40Z","lastTransitionTime":"2025-12-04T10:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.482011 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.482130 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.482151 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.482172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.482184 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:40Z","lastTransitionTime":"2025-12-04T10:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.585230 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.585275 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.585291 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.585314 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.585331 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:40Z","lastTransitionTime":"2025-12-04T10:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.688319 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.688370 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.688386 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.688408 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.688425 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:40Z","lastTransitionTime":"2025-12-04T10:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.791388 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.791497 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.791537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.791560 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.791576 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:40Z","lastTransitionTime":"2025-12-04T10:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.893693 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.893754 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.893773 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.893795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.893812 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:40Z","lastTransitionTime":"2025-12-04T10:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.996735 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.996797 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.996822 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.996866 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:40 crc kubenswrapper[4831]: I1204 10:15:40.996892 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:40Z","lastTransitionTime":"2025-12-04T10:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.100309 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.100367 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.100384 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.100408 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.100425 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:41Z","lastTransitionTime":"2025-12-04T10:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.203907 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.203968 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.203988 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.204012 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.204028 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:41Z","lastTransitionTime":"2025-12-04T10:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.276021 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:41 crc kubenswrapper[4831]: E1204 10:15:41.276249 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.306738 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.306828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.306844 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.306866 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.306881 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:41Z","lastTransitionTime":"2025-12-04T10:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.411151 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.411192 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.411202 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.411217 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.411227 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:41Z","lastTransitionTime":"2025-12-04T10:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.514076 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.514119 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.514134 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.514154 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.514170 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:41Z","lastTransitionTime":"2025-12-04T10:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.617907 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.617959 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.617974 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.617995 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.618008 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:41Z","lastTransitionTime":"2025-12-04T10:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.720779 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.720816 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.720826 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.720842 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.720853 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:41Z","lastTransitionTime":"2025-12-04T10:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.768547 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:41 crc kubenswrapper[4831]: E1204 10:15:41.768698 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:41 crc kubenswrapper[4831]: E1204 10:15:41.768761 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs podName:5b50ce71-ca0a-4532-86e9-4f779dcc7b93 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:45.768744864 +0000 UTC m=+42.717920178 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs") pod "network-metrics-daemon-fd6cw" (UID: "5b50ce71-ca0a-4532-86e9-4f779dcc7b93") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.823314 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.823358 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.823367 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.823379 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.823388 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:41Z","lastTransitionTime":"2025-12-04T10:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.925705 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.925749 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.925768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.925787 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:41 crc kubenswrapper[4831]: I1204 10:15:41.925796 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:41Z","lastTransitionTime":"2025-12-04T10:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.028284 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.028344 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.028355 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.028370 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.028379 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:42Z","lastTransitionTime":"2025-12-04T10:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.131823 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.131869 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.131880 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.131896 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.131905 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:42Z","lastTransitionTime":"2025-12-04T10:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.234088 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.234162 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.234186 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.234216 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.234239 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:42Z","lastTransitionTime":"2025-12-04T10:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.276017 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.276062 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.276085 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:42 crc kubenswrapper[4831]: E1204 10:15:42.276227 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:42 crc kubenswrapper[4831]: E1204 10:15:42.276356 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:42 crc kubenswrapper[4831]: E1204 10:15:42.276518 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.337446 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.337559 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.337580 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.337608 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.337630 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:42Z","lastTransitionTime":"2025-12-04T10:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.441217 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.441329 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.441357 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.441386 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.441407 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:42Z","lastTransitionTime":"2025-12-04T10:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.544123 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.544165 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.544173 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.544188 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.544198 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:42Z","lastTransitionTime":"2025-12-04T10:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.646415 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.646458 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.646469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.646485 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.646497 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:42Z","lastTransitionTime":"2025-12-04T10:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.748865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.748911 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.748920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.748935 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.748946 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:42Z","lastTransitionTime":"2025-12-04T10:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.851837 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.851895 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.851905 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.851921 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.851933 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:42Z","lastTransitionTime":"2025-12-04T10:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.954860 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.954928 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.954939 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.954955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:42 crc kubenswrapper[4831]: I1204 10:15:42.954967 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:42Z","lastTransitionTime":"2025-12-04T10:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.058070 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.058122 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.058133 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.058150 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.058162 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:43Z","lastTransitionTime":"2025-12-04T10:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.161423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.161483 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.161493 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.161507 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.161517 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:43Z","lastTransitionTime":"2025-12-04T10:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.263382 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.263446 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.263454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.263468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.263476 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:43Z","lastTransitionTime":"2025-12-04T10:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.275743 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:43 crc kubenswrapper[4831]: E1204 10:15:43.275859 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.289865 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.303235 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.315126 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.325719 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.335990 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc 
kubenswrapper[4831]: I1204 10:15:43.346914 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.359418 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.369705 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:43 crc 
kubenswrapper[4831]: I1204 10:15:43.369775 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.369793 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.369818 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.369835 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:43Z","lastTransitionTime":"2025-12-04T10:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.381035 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"5:35.785914 6259 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 
10:15:35.785917 6259 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 10:15:35.785923 6259 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786159 6259 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786390 6259 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786452 6259 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786578 6259 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 10:15:35.786806 6259 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.786944 6259 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.787222 6259 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.392123 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.409628 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T1
0:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.424825 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637
d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.435952 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2ffa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.448182 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.461196 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.471717 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.471758 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.471776 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.471796 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.471813 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:43Z","lastTransitionTime":"2025-12-04T10:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.473601 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.486417 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:43Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.574097 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.574144 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.574156 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.574185 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.574210 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:43Z","lastTransitionTime":"2025-12-04T10:15:43Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.676506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.676608 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.676627 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.676650 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.676697 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:43Z","lastTransitionTime":"2025-12-04T10:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.779446 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.779504 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.779520 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.779543 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.779555 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:43Z","lastTransitionTime":"2025-12-04T10:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.882478 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.882569 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.882585 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.882611 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.882629 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:43Z","lastTransitionTime":"2025-12-04T10:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.986056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.986166 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.986185 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.986212 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:43 crc kubenswrapper[4831]: I1204 10:15:43.986229 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:43Z","lastTransitionTime":"2025-12-04T10:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.089392 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.089470 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.089488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.089511 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.089528 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:44Z","lastTransitionTime":"2025-12-04T10:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.192580 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.192640 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.192702 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.192727 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.192743 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:44Z","lastTransitionTime":"2025-12-04T10:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.275954 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.275981 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.275980 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:44 crc kubenswrapper[4831]: E1204 10:15:44.276078 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:44 crc kubenswrapper[4831]: E1204 10:15:44.276228 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:44 crc kubenswrapper[4831]: E1204 10:15:44.276344 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.294826 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.294853 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.294861 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.294873 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.294881 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:44Z","lastTransitionTime":"2025-12-04T10:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.397771 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.397817 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.397836 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.397858 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.397876 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:44Z","lastTransitionTime":"2025-12-04T10:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.501060 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.501118 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.501154 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.501191 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.501215 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:44Z","lastTransitionTime":"2025-12-04T10:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.604334 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.604397 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.604413 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.604436 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.604453 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:44Z","lastTransitionTime":"2025-12-04T10:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.707537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.707603 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.707624 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.707651 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.707702 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:44Z","lastTransitionTime":"2025-12-04T10:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.811032 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.811088 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.811104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.811125 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.811141 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:44Z","lastTransitionTime":"2025-12-04T10:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.913544 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.913581 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.913593 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.913607 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:44 crc kubenswrapper[4831]: I1204 10:15:44.913619 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:44Z","lastTransitionTime":"2025-12-04T10:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.017743 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.018120 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.018277 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.018388 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.018475 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:45Z","lastTransitionTime":"2025-12-04T10:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.122231 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.122538 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.122762 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.122907 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.123041 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:45Z","lastTransitionTime":"2025-12-04T10:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.226689 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.226755 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.226769 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.226796 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.226817 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:45Z","lastTransitionTime":"2025-12-04T10:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.276186 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:45 crc kubenswrapper[4831]: E1204 10:15:45.276371 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.330211 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.330581 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.330763 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.330898 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.331108 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:45Z","lastTransitionTime":"2025-12-04T10:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.433992 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.434045 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.434087 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.434109 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.434119 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:45Z","lastTransitionTime":"2025-12-04T10:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.537355 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.537457 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.537476 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.537501 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.537521 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:45Z","lastTransitionTime":"2025-12-04T10:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.640626 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.640721 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.640739 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.640764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.640782 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:45Z","lastTransitionTime":"2025-12-04T10:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.743560 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.743626 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.743643 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.743705 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.743726 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:45Z","lastTransitionTime":"2025-12-04T10:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.812943 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:45 crc kubenswrapper[4831]: E1204 10:15:45.813202 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:45 crc kubenswrapper[4831]: E1204 10:15:45.813312 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs podName:5b50ce71-ca0a-4532-86e9-4f779dcc7b93 nodeName:}" failed. No retries permitted until 2025-12-04 10:15:53.813279178 +0000 UTC m=+50.762454532 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs") pod "network-metrics-daemon-fd6cw" (UID: "5b50ce71-ca0a-4532-86e9-4f779dcc7b93") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.846492 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.846540 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.846551 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.846565 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.846574 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:45Z","lastTransitionTime":"2025-12-04T10:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.949378 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.949434 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.949450 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.949472 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:45 crc kubenswrapper[4831]: I1204 10:15:45.949490 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:45Z","lastTransitionTime":"2025-12-04T10:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.052579 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.052704 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.052732 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.052764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.052787 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.156182 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.156680 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.156780 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.156859 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.156933 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.259590 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.259647 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.259685 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.259709 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.259726 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.276389 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.276431 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.276449 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:46 crc kubenswrapper[4831]: E1204 10:15:46.276582 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:46 crc kubenswrapper[4831]: E1204 10:15:46.276795 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:46 crc kubenswrapper[4831]: E1204 10:15:46.277002 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.363044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.363097 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.363110 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.363131 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.363146 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.465781 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.466103 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.466185 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.466291 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.466384 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.569446 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.569488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.569499 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.569516 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.569526 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.600443 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.600499 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.600515 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.600540 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.600562 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: E1204 10:15:46.614035 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:46Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.618517 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.618563 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.618577 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.618598 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.618613 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: E1204 10:15:46.630562 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:46Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.634787 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.634844 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.634857 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.634878 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.634892 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: E1204 10:15:46.652512 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:46Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.656707 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.656744 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.656757 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.656772 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.656784 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: E1204 10:15:46.670930 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:46Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.675354 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.675406 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.675421 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.675444 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.675461 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: E1204 10:15:46.692731 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:46Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:46 crc kubenswrapper[4831]: E1204 10:15:46.692984 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.707705 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.707801 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.707831 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.707867 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.707906 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.811388 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.811454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.811472 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.811496 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.811513 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.914058 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.914101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.914111 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.914124 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:46 crc kubenswrapper[4831]: I1204 10:15:46.914133 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:46Z","lastTransitionTime":"2025-12-04T10:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.017441 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.017497 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.017514 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.017540 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.017558 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:47Z","lastTransitionTime":"2025-12-04T10:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.120020 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.120091 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.120101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.120121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.120135 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:47Z","lastTransitionTime":"2025-12-04T10:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.223592 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.223635 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.223645 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.223687 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.223699 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:47Z","lastTransitionTime":"2025-12-04T10:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.275589 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:47 crc kubenswrapper[4831]: E1204 10:15:47.275978 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.326608 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.326686 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.326696 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.326717 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.326729 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:47Z","lastTransitionTime":"2025-12-04T10:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.429108 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.429164 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.429173 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.429189 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.429199 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:47Z","lastTransitionTime":"2025-12-04T10:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.532409 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.532468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.532483 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.532503 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.532517 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:47Z","lastTransitionTime":"2025-12-04T10:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.634939 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.634998 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.635009 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.635027 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.635040 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:47Z","lastTransitionTime":"2025-12-04T10:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.737916 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.738011 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.738036 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.738070 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.738095 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:47Z","lastTransitionTime":"2025-12-04T10:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.840957 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.841023 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.841042 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.841071 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.841094 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:47Z","lastTransitionTime":"2025-12-04T10:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.944095 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.944346 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.944459 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.944548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:47 crc kubenswrapper[4831]: I1204 10:15:47.944694 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:47Z","lastTransitionTime":"2025-12-04T10:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.047803 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.047863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.047880 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.047908 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.047926 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:48Z","lastTransitionTime":"2025-12-04T10:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.151397 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.151485 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.151510 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.151547 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.151575 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:48Z","lastTransitionTime":"2025-12-04T10:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.255341 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.255386 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.255398 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.255424 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.255436 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:48Z","lastTransitionTime":"2025-12-04T10:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.276372 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.276459 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:48 crc kubenswrapper[4831]: E1204 10:15:48.276565 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.276639 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:48 crc kubenswrapper[4831]: E1204 10:15:48.276857 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:48 crc kubenswrapper[4831]: E1204 10:15:48.277464 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.277940 4831 scope.go:117] "RemoveContainer" containerID="d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.358284 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.358334 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.358354 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.358379 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.358396 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:48Z","lastTransitionTime":"2025-12-04T10:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.461349 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.461415 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.461434 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.461459 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.461477 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:48Z","lastTransitionTime":"2025-12-04T10:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.564863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.564938 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.564964 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.564999 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.565023 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:48Z","lastTransitionTime":"2025-12-04T10:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.667395 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.667467 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.667489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.667521 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.667544 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:48Z","lastTransitionTime":"2025-12-04T10:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.770465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.771051 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.771073 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.771101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.771121 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:48Z","lastTransitionTime":"2025-12-04T10:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.873942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.873996 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.874011 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.874029 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.874043 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:48Z","lastTransitionTime":"2025-12-04T10:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.977336 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.977408 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.977429 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.977458 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:48 crc kubenswrapper[4831]: I1204 10:15:48.977480 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:48Z","lastTransitionTime":"2025-12-04T10:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.080157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.080232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.080269 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.080295 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.080304 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:49Z","lastTransitionTime":"2025-12-04T10:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.182374 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.182413 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.182422 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.182436 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.182445 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:49Z","lastTransitionTime":"2025-12-04T10:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.276232 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:49 crc kubenswrapper[4831]: E1204 10:15:49.276389 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.285061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.285118 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.285133 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.285153 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.285165 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:49Z","lastTransitionTime":"2025-12-04T10:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.387122 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.387157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.387166 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.387180 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.387190 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:49Z","lastTransitionTime":"2025-12-04T10:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.489942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.490020 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.490033 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.490051 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.490074 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:49Z","lastTransitionTime":"2025-12-04T10:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.593090 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.593143 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.593159 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.593182 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.593199 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:49Z","lastTransitionTime":"2025-12-04T10:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.597948 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/1.log" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.602886 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8"} Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.603554 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.626805 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.649631 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.669225 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.689777 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.695768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.695848 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.695869 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.695897 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.695918 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:49Z","lastTransitionTime":"2025-12-04T10:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.706643 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b097
15bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.722412 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.742573 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.759689 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.772602 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc 
kubenswrapper[4831]: I1204 10:15:49.794463 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.799158 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.799213 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.799267 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.799288 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.799304 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:49Z","lastTransitionTime":"2025-12-04T10:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.815689 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.847208 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"5:35.785914 6259 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 10:15:35.785917 6259 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 10:15:35.785923 6259 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786159 6259 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786390 6259 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786452 6259 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786578 6259 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 10:15:35.786806 6259 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.786944 6259 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.787222 6259 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.862415 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.881188 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T1
0:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.902752 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.902819 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.902847 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.902878 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.902902 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:49Z","lastTransitionTime":"2025-12-04T10:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.905121 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1e
fbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:49 crc kubenswrapper[4831]: I1204 10:15:49.919654 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2f
fa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:49Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.005722 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.005788 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.005805 4831 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.005833 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.005851 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:50Z","lastTransitionTime":"2025-12-04T10:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.109558 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.109609 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.109620 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.109644 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.109655 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:50Z","lastTransitionTime":"2025-12-04T10:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.212744 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.212804 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.212825 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.212854 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.212876 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:50Z","lastTransitionTime":"2025-12-04T10:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.276195 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:50 crc kubenswrapper[4831]: E1204 10:15:50.276307 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.276209 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.276195 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:50 crc kubenswrapper[4831]: E1204 10:15:50.276373 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:50 crc kubenswrapper[4831]: E1204 10:15:50.276424 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.315121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.315152 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.315174 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.315191 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.315200 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:50Z","lastTransitionTime":"2025-12-04T10:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.418323 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.418386 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.418408 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.418440 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.418461 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:50Z","lastTransitionTime":"2025-12-04T10:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.521474 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.521535 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.521545 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.521561 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.521571 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:50Z","lastTransitionTime":"2025-12-04T10:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.613751 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/2.log" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.614822 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/1.log" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.618508 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8" exitCode=1 Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.618551 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8"} Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.618587 4831 scope.go:117] "RemoveContainer" containerID="d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.619986 4831 scope.go:117] "RemoveContainer" containerID="32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8" Dec 04 10:15:50 crc kubenswrapper[4831]: E1204 10:15:50.620284 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.623971 4831 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.624026 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.624043 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.624066 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.624083 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:50Z","lastTransitionTime":"2025-12-04T10:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.642066 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b097
15bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.658085 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.676339 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.690272 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.706324 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc 
kubenswrapper[4831]: I1204 10:15:50.725495 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.726787 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.726818 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.726826 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.726840 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.726849 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:50Z","lastTransitionTime":"2025-12-04T10:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.743005 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.775475 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d36280e9b7a9c59f592f19bf2d522693e7c55a6621cc5d65fd9e6b61314417e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"message\\\":\\\"5:35.785914 6259 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 10:15:35.785917 6259 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 10:15:35.785923 6259 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786159 6259 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786390 6259 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786452 6259 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 10:15:35.786578 6259 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 10:15:35.786806 6259 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.786944 6259 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 10:15:35.787222 6259 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:49Z\\\",\\\"message\\\":\\\"e hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415770 6474 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415841 6474 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1204 10:15:49.415869 6474 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:49.415914 6474 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 10:15:49.415980 6474 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\"
:\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/
env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc 
kubenswrapper[4831]: I1204 10:15:50.789867 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4
l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.805258 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set 
denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.829027 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\
"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.830255 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.830300 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.830321 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.830347 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.830371 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:50Z","lastTransitionTime":"2025-12-04T10:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.843748 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2ffa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.861579 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.876635 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.894545 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.916465 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:50Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.933578 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.933890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.934041 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.934176 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:50 crc kubenswrapper[4831]: I1204 10:15:50.934277 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:50Z","lastTransitionTime":"2025-12-04T10:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.037477 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.037531 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.037543 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.037560 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.037572 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:51Z","lastTransitionTime":"2025-12-04T10:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.140242 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.140285 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.140296 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.140314 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.140326 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:51Z","lastTransitionTime":"2025-12-04T10:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.243392 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.243443 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.243453 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.243469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.243480 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:51Z","lastTransitionTime":"2025-12-04T10:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.275384 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:51 crc kubenswrapper[4831]: E1204 10:15:51.275616 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.346863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.346904 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.346918 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.346937 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.346951 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:51Z","lastTransitionTime":"2025-12-04T10:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.449368 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.449406 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.449415 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.449431 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.449441 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:51Z","lastTransitionTime":"2025-12-04T10:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.552703 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.552741 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.552750 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.552764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.552774 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:51Z","lastTransitionTime":"2025-12-04T10:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.627934 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/2.log" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.634156 4831 scope.go:117] "RemoveContainer" containerID="32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8" Dec 04 10:15:51 crc kubenswrapper[4831]: E1204 10:15:51.634419 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.655466 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.655528 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.655540 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.655565 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.655580 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:51Z","lastTransitionTime":"2025-12-04T10:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.666944 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:49Z\\\",\\\"message\\\":\\\"e hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415770 6474 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415841 6474 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1204 10:15:49.415869 6474 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:49.415914 6474 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 10:15:49.415980 6474 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.684832 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.705657 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.729305 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.749823 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd7
8e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:
15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.759522 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.759628 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.759650 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.759708 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.759725 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:51Z","lastTransitionTime":"2025-12-04T10:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.771529 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2ffa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.793817 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T
10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023
ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.813832 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.834809 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.856177 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.862556 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.862626 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.862642 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.862689 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.862706 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:51Z","lastTransitionTime":"2025-12-04T10:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.872931 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.890236 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.907053 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.924464 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc 
kubenswrapper[4831]: I1204 10:15:51.945633 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.962092 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:51Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.965452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.965559 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.965580 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.965604 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:51 crc kubenswrapper[4831]: I1204 10:15:51.965623 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:51Z","lastTransitionTime":"2025-12-04T10:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.070589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.070646 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.070679 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.070701 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.070715 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:52Z","lastTransitionTime":"2025-12-04T10:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.174028 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.174078 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.174095 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.174120 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.174139 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:52Z","lastTransitionTime":"2025-12-04T10:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.275361 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.275553 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:52 crc kubenswrapper[4831]: E1204 10:15:52.275684 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.275734 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:52 crc kubenswrapper[4831]: E1204 10:15:52.275926 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:52 crc kubenswrapper[4831]: E1204 10:15:52.276121 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.276850 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.276942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.276966 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.276993 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.277025 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:52Z","lastTransitionTime":"2025-12-04T10:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.379396 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.379438 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.379450 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.379466 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.379477 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:52Z","lastTransitionTime":"2025-12-04T10:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.481373 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.481447 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.481469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.481499 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.481521 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:52Z","lastTransitionTime":"2025-12-04T10:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.585108 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.585145 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.585157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.585176 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.585188 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:52Z","lastTransitionTime":"2025-12-04T10:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.687277 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.688240 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.688434 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.688585 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.688765 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:52Z","lastTransitionTime":"2025-12-04T10:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.791842 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.791910 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.791926 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.791977 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.791994 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:52Z","lastTransitionTime":"2025-12-04T10:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.895172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.895246 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.895269 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.895302 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.895324 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:52Z","lastTransitionTime":"2025-12-04T10:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.998333 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.998387 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.998404 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.998431 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:52 crc kubenswrapper[4831]: I1204 10:15:52.998456 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:52Z","lastTransitionTime":"2025-12-04T10:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.101507 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.101573 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.101592 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.101627 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.101708 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:53Z","lastTransitionTime":"2025-12-04T10:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.204361 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.204534 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.204563 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.204596 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.204618 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:53Z","lastTransitionTime":"2025-12-04T10:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.276163 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:53 crc kubenswrapper[4831]: E1204 10:15:53.276405 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.298406 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.307398 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.307466 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.307487 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.307511 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.307527 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:53Z","lastTransitionTime":"2025-12-04T10:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.318725 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.347684 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:49Z\\\",\\\"message\\\":\\\"e hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415770 6474 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415841 6474 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1204 10:15:49.415869 6474 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:49.415914 6474 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 10:15:49.415980 6474 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.368288 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.392345 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T1
0:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.409427 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.409477 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.409493 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.409516 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.409533 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:53Z","lastTransitionTime":"2025-12-04T10:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.422326 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1e
fbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.440967 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2f
fa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.460691 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.480732 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.499185 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.512958 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.513031 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.513055 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.513084 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.513106 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:53Z","lastTransitionTime":"2025-12-04T10:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.519487 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.542523 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.560050 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.582005 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.599842 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.616699 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.616758 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.616778 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:53 crc 
kubenswrapper[4831]: I1204 10:15:53.616802 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.616819 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:53Z","lastTransitionTime":"2025-12-04T10:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.617189 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:53Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:53 crc 
kubenswrapper[4831]: I1204 10:15:53.720302 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.720349 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.720365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.720388 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.720408 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:53Z","lastTransitionTime":"2025-12-04T10:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.822941 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.822982 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.822994 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.823009 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.823022 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:53Z","lastTransitionTime":"2025-12-04T10:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.905291 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:53 crc kubenswrapper[4831]: E1204 10:15:53.905491 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:53 crc kubenswrapper[4831]: E1204 10:15:53.905630 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs podName:5b50ce71-ca0a-4532-86e9-4f779dcc7b93 nodeName:}" failed. No retries permitted until 2025-12-04 10:16:09.905596275 +0000 UTC m=+66.854771629 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs") pod "network-metrics-daemon-fd6cw" (UID: "5b50ce71-ca0a-4532-86e9-4f779dcc7b93") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.926301 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.926350 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.926366 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.926385 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:53 crc kubenswrapper[4831]: I1204 10:15:53.926401 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:53Z","lastTransitionTime":"2025-12-04T10:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.029387 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.029432 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.029450 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.029469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.029481 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:54Z","lastTransitionTime":"2025-12-04T10:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.107446 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.107626 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 10:16:26.107595039 +0000 UTC m=+83.056770363 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.132170 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.132229 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.132238 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.132255 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.132264 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:54Z","lastTransitionTime":"2025-12-04T10:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.208966 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.209075 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.209123 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209127 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.209171 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209223 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:16:26.209198646 +0000 UTC m=+83.158374000 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209303 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209333 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209328 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209413 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209344 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209458 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209486 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209454 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:16:26.209421012 +0000 UTC m=+83.158596376 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209544 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 10:16:26.209527915 +0000 UTC m=+83.158703229 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.209558 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 10:16:26.209551716 +0000 UTC m=+83.158727030 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.234889 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.235019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.235044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.235072 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.235093 4831 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:54Z","lastTransitionTime":"2025-12-04T10:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.275633 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.275732 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.275734 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.275836 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.275933 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:54 crc kubenswrapper[4831]: E1204 10:15:54.276136 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.338971 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.339379 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.339539 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.339722 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.339879 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:54Z","lastTransitionTime":"2025-12-04T10:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.442980 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.443053 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.443078 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.443104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.443121 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:54Z","lastTransitionTime":"2025-12-04T10:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.545782 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.545852 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.545875 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.545902 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.545923 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:54Z","lastTransitionTime":"2025-12-04T10:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.649239 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.649317 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.649338 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.649368 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.649390 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:54Z","lastTransitionTime":"2025-12-04T10:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.752806 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.753544 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.753834 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.754134 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.754393 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:54Z","lastTransitionTime":"2025-12-04T10:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.858145 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.858212 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.858235 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.858266 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.858289 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:54Z","lastTransitionTime":"2025-12-04T10:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.961143 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.961178 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.961187 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.961201 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:54 crc kubenswrapper[4831]: I1204 10:15:54.961211 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:54Z","lastTransitionTime":"2025-12-04T10:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.064806 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.064863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.064879 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.064901 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.064917 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:55Z","lastTransitionTime":"2025-12-04T10:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.167939 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.167995 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.168016 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.168044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.168067 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:55Z","lastTransitionTime":"2025-12-04T10:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.271158 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.271227 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.271244 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.271282 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.271300 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:55Z","lastTransitionTime":"2025-12-04T10:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.277696 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:55 crc kubenswrapper[4831]: E1204 10:15:55.277799 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.374090 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.374154 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.374171 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.374203 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.374223 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:55Z","lastTransitionTime":"2025-12-04T10:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.477327 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.477398 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.477423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.477452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.477475 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:55Z","lastTransitionTime":"2025-12-04T10:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.579868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.579908 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.579920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.579936 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.579949 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:55Z","lastTransitionTime":"2025-12-04T10:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.683181 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.683226 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.683236 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.683254 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.683265 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:55Z","lastTransitionTime":"2025-12-04T10:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.786905 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.786967 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.786987 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.787009 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.787026 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:55Z","lastTransitionTime":"2025-12-04T10:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.889569 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.889652 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.889720 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.889744 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.889761 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:55Z","lastTransitionTime":"2025-12-04T10:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.992579 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.992623 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.992640 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.992691 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:55 crc kubenswrapper[4831]: I1204 10:15:55.992709 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:55Z","lastTransitionTime":"2025-12-04T10:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.095747 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.096179 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.096319 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.096489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.096642 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.199731 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.199799 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.199816 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.199842 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.199860 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.275705 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.275717 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.275764 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:56 crc kubenswrapper[4831]: E1204 10:15:56.276383 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:56 crc kubenswrapper[4831]: E1204 10:15:56.276427 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:56 crc kubenswrapper[4831]: E1204 10:15:56.276256 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.302720 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.302774 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.302786 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.302804 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.302819 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.406031 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.406093 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.406111 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.406137 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.406158 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.509192 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.509233 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.509242 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.509257 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.509266 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.612763 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.612823 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.612840 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.612864 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.612881 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.715483 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.715530 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.715544 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.715562 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.715575 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.818797 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.818841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.818853 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.818873 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.818886 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.842991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.843042 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.843060 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.843083 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.843100 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: E1204 10:15:56.869638 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:56Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.874399 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.874468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.874487 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.874958 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.875021 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: E1204 10:15:56.892543 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:56Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.896731 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.896767 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.896775 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.896799 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.896809 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: E1204 10:15:56.909581 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:56Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.912737 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.912764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.912775 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.912789 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.912800 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: E1204 10:15:56.922899 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:56Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.926290 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.926440 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.926548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.926674 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.926809 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:56 crc kubenswrapper[4831]: E1204 10:15:56.944561 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:56Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:56 crc kubenswrapper[4831]: E1204 10:15:56.944691 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.946199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.946221 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.946230 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.946245 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:56 crc kubenswrapper[4831]: I1204 10:15:56.946256 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:56Z","lastTransitionTime":"2025-12-04T10:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.049205 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.049245 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.049257 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.049274 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.049286 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:57Z","lastTransitionTime":"2025-12-04T10:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.151554 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.151588 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.151597 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.151610 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.151625 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:57Z","lastTransitionTime":"2025-12-04T10:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.254801 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.254873 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.254894 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.254925 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.254947 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:57Z","lastTransitionTime":"2025-12-04T10:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.275473 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:57 crc kubenswrapper[4831]: E1204 10:15:57.275764 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.357557 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.357795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.357817 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.357833 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.357843 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:57Z","lastTransitionTime":"2025-12-04T10:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.460061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.460137 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.460160 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.460187 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.460209 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:57Z","lastTransitionTime":"2025-12-04T10:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.563322 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.563364 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.563376 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.563393 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.563408 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:57Z","lastTransitionTime":"2025-12-04T10:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.666628 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.666726 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.666744 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.666770 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.666789 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:57Z","lastTransitionTime":"2025-12-04T10:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.769225 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.769593 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.769729 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.769843 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.769921 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:57Z","lastTransitionTime":"2025-12-04T10:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.873503 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.873561 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.873583 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.873614 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.873637 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:57Z","lastTransitionTime":"2025-12-04T10:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.977218 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.977285 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.977306 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.977336 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:57 crc kubenswrapper[4831]: I1204 10:15:57.977356 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:57Z","lastTransitionTime":"2025-12-04T10:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.080213 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.080251 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.080260 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.080272 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.080280 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:58Z","lastTransitionTime":"2025-12-04T10:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.183220 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.183262 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.183273 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.183295 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.183320 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:58Z","lastTransitionTime":"2025-12-04T10:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.276188 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.276233 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.276360 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:15:58 crc kubenswrapper[4831]: E1204 10:15:58.276527 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:15:58 crc kubenswrapper[4831]: E1204 10:15:58.276637 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:15:58 crc kubenswrapper[4831]: E1204 10:15:58.276919 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.286363 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.286419 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.286438 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.286461 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.286478 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:58Z","lastTransitionTime":"2025-12-04T10:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.389386 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.389441 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.389462 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.389489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.389512 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:58Z","lastTransitionTime":"2025-12-04T10:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.494303 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.494362 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.494380 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.494405 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.494423 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:58Z","lastTransitionTime":"2025-12-04T10:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.598216 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.598287 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.598305 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.598336 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.598352 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:58Z","lastTransitionTime":"2025-12-04T10:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.701620 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.701731 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.701751 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.701775 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.701793 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:58Z","lastTransitionTime":"2025-12-04T10:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.805520 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.805960 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.806156 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.806415 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.806708 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:58Z","lastTransitionTime":"2025-12-04T10:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.910160 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.910875 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.910914 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.910941 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:58 crc kubenswrapper[4831]: I1204 10:15:58.910959 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:58Z","lastTransitionTime":"2025-12-04T10:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.013620 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.013702 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.013723 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.013747 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.013765 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:59Z","lastTransitionTime":"2025-12-04T10:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.116189 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.116234 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.116254 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.116274 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.116285 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:59Z","lastTransitionTime":"2025-12-04T10:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.218796 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.218841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.218857 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.218877 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.218890 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:59Z","lastTransitionTime":"2025-12-04T10:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.275477 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:15:59 crc kubenswrapper[4831]: E1204 10:15:59.275714 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.321928 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.321994 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.322015 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.322040 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.322058 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:59Z","lastTransitionTime":"2025-12-04T10:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.425089 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.425147 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.425163 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.425186 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.425204 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:59Z","lastTransitionTime":"2025-12-04T10:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.528107 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.528139 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.528147 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.528159 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.528169 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:59Z","lastTransitionTime":"2025-12-04T10:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.631357 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.631427 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.631438 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.631454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.631486 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:59Z","lastTransitionTime":"2025-12-04T10:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.735328 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.735804 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.735828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.735853 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.735872 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:59Z","lastTransitionTime":"2025-12-04T10:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.780568 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.793622 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.804921 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.820308 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2f
fa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.836926 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e3
11683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.837929 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.837979 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.837995 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.838014 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.838031 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:59Z","lastTransitionTime":"2025-12-04T10:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.853436 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.871846 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.883952 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.897038 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.910616 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.923172 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.933783 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc 
kubenswrapper[4831]: I1204 10:15:59.940246 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.940295 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.940310 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.940328 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.940340 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:15:59Z","lastTransitionTime":"2025-12-04T10:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.946542 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b097
15bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.956094 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.974546 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:49Z\\\",\\\"message\\\":\\\"e hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415770 6474 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415841 6474 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1204 10:15:49.415869 6474 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:49.415914 6474 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 10:15:49.415980 6474 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:15:59 crc kubenswrapper[4831]: I1204 10:15:59.989032 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:15:59Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.003374 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:00Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.019087 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:00Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.043381 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:00 crc 
kubenswrapper[4831]: I1204 10:16:00.043419 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.043429 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.043444 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.043454 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:00Z","lastTransitionTime":"2025-12-04T10:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.145321 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.145357 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.145368 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.145385 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.145396 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:00Z","lastTransitionTime":"2025-12-04T10:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.248393 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.248460 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.248480 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.248505 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.248524 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:00Z","lastTransitionTime":"2025-12-04T10:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.276192 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.276231 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.276295 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:00 crc kubenswrapper[4831]: E1204 10:16:00.276373 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:00 crc kubenswrapper[4831]: E1204 10:16:00.276513 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:00 crc kubenswrapper[4831]: E1204 10:16:00.276677 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.354581 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.354643 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.354697 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.354729 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.354749 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:00Z","lastTransitionTime":"2025-12-04T10:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.457951 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.458128 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.458167 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.458194 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.458212 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:00Z","lastTransitionTime":"2025-12-04T10:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.560895 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.560976 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.560988 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.561005 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.561016 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:00Z","lastTransitionTime":"2025-12-04T10:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.663870 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.663963 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.663981 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.664003 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.664020 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:00Z","lastTransitionTime":"2025-12-04T10:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.767015 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.767076 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.767095 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.767117 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.767135 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:00Z","lastTransitionTime":"2025-12-04T10:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.870800 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.870867 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.870888 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.870915 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.870934 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:00Z","lastTransitionTime":"2025-12-04T10:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.974032 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.974092 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.974109 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.974136 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:00 crc kubenswrapper[4831]: I1204 10:16:00.974178 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:00Z","lastTransitionTime":"2025-12-04T10:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.076901 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.077098 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.077120 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.077144 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.077163 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:01Z","lastTransitionTime":"2025-12-04T10:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.180248 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.180297 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.180306 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.180323 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.180334 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:01Z","lastTransitionTime":"2025-12-04T10:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.275977 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:01 crc kubenswrapper[4831]: E1204 10:16:01.276188 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.283019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.283083 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.283095 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.283110 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.283123 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:01Z","lastTransitionTime":"2025-12-04T10:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.385568 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.385611 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.385619 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.385633 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.385642 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:01Z","lastTransitionTime":"2025-12-04T10:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.488265 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.488331 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.488347 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.488376 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.488393 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:01Z","lastTransitionTime":"2025-12-04T10:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.591245 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.591304 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.591326 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.591348 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.591364 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:01Z","lastTransitionTime":"2025-12-04T10:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.694092 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.694162 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.694184 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.694209 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.694227 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:01Z","lastTransitionTime":"2025-12-04T10:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.798067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.798127 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.798176 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.798207 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.798230 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:01Z","lastTransitionTime":"2025-12-04T10:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.900463 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.900532 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.900548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.900572 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:01 crc kubenswrapper[4831]: I1204 10:16:01.900589 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:01Z","lastTransitionTime":"2025-12-04T10:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.002626 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.002689 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.002701 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.002716 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.002728 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:02Z","lastTransitionTime":"2025-12-04T10:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.105234 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.105297 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.105316 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.105340 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.105356 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:02Z","lastTransitionTime":"2025-12-04T10:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.207981 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.208049 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.208063 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.208081 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.208094 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:02Z","lastTransitionTime":"2025-12-04T10:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.275807 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.275840 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:02 crc kubenswrapper[4831]: E1204 10:16:02.275918 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.276022 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:02 crc kubenswrapper[4831]: E1204 10:16:02.276111 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:02 crc kubenswrapper[4831]: E1204 10:16:02.276209 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.310907 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.310955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.310962 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.310976 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.310984 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:02Z","lastTransitionTime":"2025-12-04T10:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.414060 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.414121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.414137 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.414203 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.414256 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:02Z","lastTransitionTime":"2025-12-04T10:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.517422 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.517487 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.517504 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.517529 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.517547 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:02Z","lastTransitionTime":"2025-12-04T10:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.620044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.620129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.620153 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.620183 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.620204 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:02Z","lastTransitionTime":"2025-12-04T10:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.724107 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.724151 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.724162 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.724179 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.724190 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:02Z","lastTransitionTime":"2025-12-04T10:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.827083 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.827143 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.827160 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.827184 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.827200 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:02Z","lastTransitionTime":"2025-12-04T10:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.930148 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.930213 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.930233 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.930261 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:02 crc kubenswrapper[4831]: I1204 10:16:02.930281 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:02Z","lastTransitionTime":"2025-12-04T10:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.032616 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.032840 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.032869 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.032897 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.032919 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:03Z","lastTransitionTime":"2025-12-04T10:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.136353 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.136441 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.136486 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.136572 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.136610 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:03Z","lastTransitionTime":"2025-12-04T10:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.239720 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.239776 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.239795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.239820 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.239839 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:03Z","lastTransitionTime":"2025-12-04T10:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.275787 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:03 crc kubenswrapper[4831]: E1204 10:16:03.275938 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.299153 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.318386 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.340969 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:49Z\\\",\\\"message\\\":\\\"e hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415770 6474 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415841 6474 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1204 10:15:49.415869 6474 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:49.415914 6474 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 10:15:49.415980 6474 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.342146 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.342187 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.342195 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.342209 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.342218 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:03Z","lastTransitionTime":"2025-12-04T10:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.353134 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.368191 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e3
11683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.383246 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637
d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.395966 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2ffa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.406751 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.423688 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15fc060-02d6-4ae3-95f5-60660050cced\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcdee1a860ddfe046b4983d9b2422cf6c993be574f549cb3667179d92944884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8d355d4394ba0854fc067550b15f70e9e72a2a1efb0d7f981574db2f81836c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://745df7e6079182e2ba297b8e268d8a1647ce0f856e99f8f90d3dc92a6a071087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.435350 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.445001 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.445045 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.445056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.445073 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.445087 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:03Z","lastTransitionTime":"2025-12-04T10:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.452466 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.464392 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.481030 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.491631 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.502267 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.511032 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.521346 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:03Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:03 crc 
kubenswrapper[4831]: I1204 10:16:03.546992 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.547119 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.547138 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.547160 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.547175 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:03Z","lastTransitionTime":"2025-12-04T10:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.649802 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.649855 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.649866 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.649882 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.649893 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:03Z","lastTransitionTime":"2025-12-04T10:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.753281 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.753333 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.753348 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.753365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.753378 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:03Z","lastTransitionTime":"2025-12-04T10:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.855764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.855808 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.855817 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.855834 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.855846 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:03Z","lastTransitionTime":"2025-12-04T10:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.958466 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.958864 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.958969 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.959077 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:03 crc kubenswrapper[4831]: I1204 10:16:03.959176 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:03Z","lastTransitionTime":"2025-12-04T10:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.062532 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.062597 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.062620 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.062860 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.062884 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:04Z","lastTransitionTime":"2025-12-04T10:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.165815 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.165895 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.165918 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.165945 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.165962 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:04Z","lastTransitionTime":"2025-12-04T10:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.268768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.268849 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.268876 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.268908 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.268932 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:04Z","lastTransitionTime":"2025-12-04T10:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.276085 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.276163 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.276326 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:04 crc kubenswrapper[4831]: E1204 10:16:04.276535 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:04 crc kubenswrapper[4831]: E1204 10:16:04.276685 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:04 crc kubenswrapper[4831]: E1204 10:16:04.276884 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.371652 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.371993 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.372202 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.372396 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.372542 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:04Z","lastTransitionTime":"2025-12-04T10:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.476077 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.476400 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.476522 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.476699 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.476855 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:04Z","lastTransitionTime":"2025-12-04T10:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.579934 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.579984 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.579999 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.580020 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.580037 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:04Z","lastTransitionTime":"2025-12-04T10:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.682533 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.682632 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.682734 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.682767 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.682784 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:04Z","lastTransitionTime":"2025-12-04T10:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.785345 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.785410 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.785420 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.785435 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.785444 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:04Z","lastTransitionTime":"2025-12-04T10:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.888308 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.888375 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.888383 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.888416 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.888428 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:04Z","lastTransitionTime":"2025-12-04T10:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.990980 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.991053 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.991070 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.991095 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:04 crc kubenswrapper[4831]: I1204 10:16:04.991116 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:04Z","lastTransitionTime":"2025-12-04T10:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.094626 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.094692 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.094701 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.094716 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.094725 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:05Z","lastTransitionTime":"2025-12-04T10:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.197119 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.197524 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.197541 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.197564 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.197580 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:05Z","lastTransitionTime":"2025-12-04T10:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.276775 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.277396 4831 scope.go:117] "RemoveContainer" containerID="32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8" Dec 04 10:16:05 crc kubenswrapper[4831]: E1204 10:16:05.277569 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" Dec 04 10:16:05 crc kubenswrapper[4831]: E1204 10:16:05.277776 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.299649 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.299839 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.299952 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.300056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.300154 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:05Z","lastTransitionTime":"2025-12-04T10:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.402621 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.402868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.402937 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.403001 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.403062 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:05Z","lastTransitionTime":"2025-12-04T10:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.506569 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.506613 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.506623 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.506638 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.506647 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:05Z","lastTransitionTime":"2025-12-04T10:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.609400 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.609807 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.610126 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.610301 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.610449 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:05Z","lastTransitionTime":"2025-12-04T10:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.713627 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.713743 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.713765 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.713795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.713814 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:05Z","lastTransitionTime":"2025-12-04T10:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.817137 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.817184 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.817196 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.817214 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.817226 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:05Z","lastTransitionTime":"2025-12-04T10:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.919595 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.919680 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.919706 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.919729 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:05 crc kubenswrapper[4831]: I1204 10:16:05.919743 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:05Z","lastTransitionTime":"2025-12-04T10:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.022281 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.022344 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.022365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.022390 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.022409 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:06Z","lastTransitionTime":"2025-12-04T10:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.126044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.126113 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.126133 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.126157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.126176 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:06Z","lastTransitionTime":"2025-12-04T10:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.231055 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.231107 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.231119 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.231139 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.231151 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:06Z","lastTransitionTime":"2025-12-04T10:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.276447 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.276564 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.276452 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:06 crc kubenswrapper[4831]: E1204 10:16:06.276696 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:06 crc kubenswrapper[4831]: E1204 10:16:06.276871 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:06 crc kubenswrapper[4831]: E1204 10:16:06.276981 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.334015 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.334101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.334127 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.334157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.334183 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:06Z","lastTransitionTime":"2025-12-04T10:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.437309 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.437352 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.437365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.437383 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.437395 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:06Z","lastTransitionTime":"2025-12-04T10:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.540464 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.540919 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.541135 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.541291 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.541487 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:06Z","lastTransitionTime":"2025-12-04T10:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.643884 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.643936 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.643952 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.643976 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.643993 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:06Z","lastTransitionTime":"2025-12-04T10:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.746049 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.746086 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.746096 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.746109 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.746117 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:06Z","lastTransitionTime":"2025-12-04T10:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.848600 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.848933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.849006 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.849076 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.849143 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:06Z","lastTransitionTime":"2025-12-04T10:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.952696 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.952809 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.952835 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.952885 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:06 crc kubenswrapper[4831]: I1204 10:16:06.952910 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:06Z","lastTransitionTime":"2025-12-04T10:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.055455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.055523 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.055542 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.055577 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.055596 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.158939 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.159700 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.159845 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.159972 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.160084 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.196474 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.196901 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.196994 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.197081 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.197164 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: E1204 10:16:07.213083 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:07Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.217948 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.217989 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.218005 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.218021 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.218034 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: E1204 10:16:07.231976 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:07Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.236219 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.236263 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.236274 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.236290 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.236302 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: E1204 10:16:07.251190 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:07Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.255265 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.255427 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.255549 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.255700 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.255875 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: E1204 10:16:07.269833 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:07Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.274370 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.274536 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.274707 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.274840 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.275013 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.275359 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:07 crc kubenswrapper[4831]: E1204 10:16:07.275453 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:07 crc kubenswrapper[4831]: E1204 10:16:07.292392 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:07Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:07 crc kubenswrapper[4831]: E1204 10:16:07.292992 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.294879 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.295054 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.295174 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.295307 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.295423 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.397823 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.398076 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.398151 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.398258 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.398350 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.501108 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.501150 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.501162 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.501180 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.501192 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.604363 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.604417 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.604437 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.604459 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.604479 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.706633 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.706749 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.706777 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.706806 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.706827 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.810130 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.810193 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.810216 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.810247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.810268 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.913100 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.913196 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.913206 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.913225 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:07 crc kubenswrapper[4831]: I1204 10:16:07.913235 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:07Z","lastTransitionTime":"2025-12-04T10:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.015887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.015985 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.016003 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.016055 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.016106 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:08Z","lastTransitionTime":"2025-12-04T10:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.118286 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.118323 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.118331 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.118549 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.118578 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:08Z","lastTransitionTime":"2025-12-04T10:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.221232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.221278 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.221294 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.221312 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.221325 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:08Z","lastTransitionTime":"2025-12-04T10:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.276222 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.276301 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.276257 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:08 crc kubenswrapper[4831]: E1204 10:16:08.276415 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:08 crc kubenswrapper[4831]: E1204 10:16:08.276513 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:08 crc kubenswrapper[4831]: E1204 10:16:08.276622 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.324147 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.324199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.324211 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.324226 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.324235 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:08Z","lastTransitionTime":"2025-12-04T10:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.426027 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.426087 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.426101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.426116 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.426127 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:08Z","lastTransitionTime":"2025-12-04T10:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.529556 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.529594 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.529604 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.529621 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.529632 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:08Z","lastTransitionTime":"2025-12-04T10:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.631889 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.631925 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.631937 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.631951 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.631963 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:08Z","lastTransitionTime":"2025-12-04T10:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.734046 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.734080 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.734090 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.734103 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.734111 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:08Z","lastTransitionTime":"2025-12-04T10:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.836258 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.836317 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.836350 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.836366 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.836378 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:08Z","lastTransitionTime":"2025-12-04T10:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.938343 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.938369 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.938377 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.938389 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:08 crc kubenswrapper[4831]: I1204 10:16:08.938397 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:08Z","lastTransitionTime":"2025-12-04T10:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.040891 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.040946 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.040959 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.040972 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.040981 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:09Z","lastTransitionTime":"2025-12-04T10:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.143318 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.143368 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.143379 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.143392 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.143402 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:09Z","lastTransitionTime":"2025-12-04T10:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.244872 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.244923 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.244940 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.244961 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.244983 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:09Z","lastTransitionTime":"2025-12-04T10:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.275427 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:09 crc kubenswrapper[4831]: E1204 10:16:09.275606 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.346952 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.347161 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.347285 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.347372 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.347448 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:09Z","lastTransitionTime":"2025-12-04T10:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.450205 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.450255 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.450267 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.450283 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.450296 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:09Z","lastTransitionTime":"2025-12-04T10:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.552349 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.552390 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.552401 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.552416 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.552426 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:09Z","lastTransitionTime":"2025-12-04T10:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.654940 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.654979 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.655006 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.655020 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.655044 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:09Z","lastTransitionTime":"2025-12-04T10:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.757179 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.757213 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.757222 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.757237 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.757247 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:09Z","lastTransitionTime":"2025-12-04T10:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.859651 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.859728 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.859743 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.859765 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.859781 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:09Z","lastTransitionTime":"2025-12-04T10:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.962099 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.962132 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.962140 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.962152 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.962160 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:09Z","lastTransitionTime":"2025-12-04T10:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:09 crc kubenswrapper[4831]: I1204 10:16:09.992572 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:09 crc kubenswrapper[4831]: E1204 10:16:09.992692 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:16:09 crc kubenswrapper[4831]: E1204 10:16:09.992750 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs podName:5b50ce71-ca0a-4532-86e9-4f779dcc7b93 nodeName:}" failed. No retries permitted until 2025-12-04 10:16:41.992736619 +0000 UTC m=+98.941911923 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs") pod "network-metrics-daemon-fd6cw" (UID: "5b50ce71-ca0a-4532-86e9-4f779dcc7b93") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.064755 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.064788 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.064798 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.064814 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.064824 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:10Z","lastTransitionTime":"2025-12-04T10:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.167064 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.167094 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.167107 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.167121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.167131 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:10Z","lastTransitionTime":"2025-12-04T10:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.269183 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.269207 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.269218 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.269229 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.269238 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:10Z","lastTransitionTime":"2025-12-04T10:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.275820 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.275856 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.275926 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:10 crc kubenswrapper[4831]: E1204 10:16:10.276003 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:10 crc kubenswrapper[4831]: E1204 10:16:10.276123 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:10 crc kubenswrapper[4831]: E1204 10:16:10.276233 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.371974 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.372024 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.372040 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.372064 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.372081 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:10Z","lastTransitionTime":"2025-12-04T10:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.473778 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.473828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.473846 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.473871 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.473890 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:10Z","lastTransitionTime":"2025-12-04T10:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.577219 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.577267 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.577280 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.577298 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.577310 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:10Z","lastTransitionTime":"2025-12-04T10:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.680877 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.680918 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.680930 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.680950 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.680970 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:10Z","lastTransitionTime":"2025-12-04T10:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.784841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.784892 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.784900 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.784912 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.784948 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:10Z","lastTransitionTime":"2025-12-04T10:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.888551 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.888592 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.888600 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.888617 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.888629 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:10Z","lastTransitionTime":"2025-12-04T10:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.990649 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.990722 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.990732 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.990747 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:10 crc kubenswrapper[4831]: I1204 10:16:10.990763 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:10Z","lastTransitionTime":"2025-12-04T10:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.092822 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.092861 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.092869 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.092887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.092900 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:11Z","lastTransitionTime":"2025-12-04T10:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.195105 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.195176 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.195200 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.195232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.195259 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:11Z","lastTransitionTime":"2025-12-04T10:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.275815 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:11 crc kubenswrapper[4831]: E1204 10:16:11.275984 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.297763 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.297809 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.297821 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.297838 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.297850 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:11Z","lastTransitionTime":"2025-12-04T10:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.400149 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.400192 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.400204 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.400222 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.400235 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:11Z","lastTransitionTime":"2025-12-04T10:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.503050 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.503116 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.503139 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.503169 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.503191 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:11Z","lastTransitionTime":"2025-12-04T10:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.606698 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.606746 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.606757 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.606772 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.606784 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:11Z","lastTransitionTime":"2025-12-04T10:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.702126 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5g27v_c6a78509-d612-4338-8562-9b0627c1793f/kube-multus/0.log" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.702181 4831 generic.go:334] "Generic (PLEG): container finished" podID="c6a78509-d612-4338-8562-9b0627c1793f" containerID="8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df" exitCode=1 Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.702212 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5g27v" event={"ID":"c6a78509-d612-4338-8562-9b0627c1793f","Type":"ContainerDied","Data":"8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df"} Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.702596 4831 scope.go:117] "RemoveContainer" containerID="8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.708095 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.708121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.708129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.708142 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.708150 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:11Z","lastTransitionTime":"2025-12-04T10:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.725194 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.737996 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.748897 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15fc060-02d6-4ae3-95f5-60660050cced\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcdee1a860ddfe046b4983d9b2422cf6c993be574f549cb3667179d92944884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8d355d4394ba0854fc067550b15f70e9e72a2a1efb0d7f981574db2f81836c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://745df7e6079182e2ba297b8e268d8a1647ce0f856e99f8f90d3dc92a6a071087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.760357 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.776674 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.788518 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc 
kubenswrapper[4831]: I1204 10:16:11.799783 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.809810 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.809843 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.809853 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.809868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.809880 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:11Z","lastTransitionTime":"2025-12-04T10:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.814053 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.825566 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.836257 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.848707 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.861112 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:16:10Z\\\",\\\"message\\\":\\\"2025-12-04T10:15:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3\\\\n2025-12-04T10:15:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3 to /host/opt/cni/bin/\\\\n2025-12-04T10:15:25Z [verbose] multus-daemon started\\\\n2025-12-04T10:15:25Z [verbose] Readiness Indicator file check\\\\n2025-12-04T10:16:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.876597 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:49Z\\\",\\\"message\\\":\\\"e hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415770 6474 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415841 6474 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1204 10:15:49.415869 6474 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:49.415914 6474 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 10:15:49.415980 6474 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.886781 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.897567 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T1
0:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.912324 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.912354 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.912362 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.912376 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.912386 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:11Z","lastTransitionTime":"2025-12-04T10:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.915737 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1e
fbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:11 crc kubenswrapper[4831]: I1204 10:16:11.928490 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2f
fa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:11Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.014589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.014625 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.014633 4831 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.014649 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.014671 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:12Z","lastTransitionTime":"2025-12-04T10:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.117167 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.117209 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.117219 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.117255 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.117266 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:12Z","lastTransitionTime":"2025-12-04T10:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.219075 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.219104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.219112 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.219126 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.219135 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:12Z","lastTransitionTime":"2025-12-04T10:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.276174 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:12 crc kubenswrapper[4831]: E1204 10:16:12.276296 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.276460 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:12 crc kubenswrapper[4831]: E1204 10:16:12.276509 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.276608 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:12 crc kubenswrapper[4831]: E1204 10:16:12.276649 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.323039 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.323072 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.323080 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.323093 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.323102 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:12Z","lastTransitionTime":"2025-12-04T10:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.426484 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.426518 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.426528 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.426542 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.426552 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:12Z","lastTransitionTime":"2025-12-04T10:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.530755 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.530835 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.530859 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.530889 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.530915 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:12Z","lastTransitionTime":"2025-12-04T10:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.633683 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.633718 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.633726 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.633740 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.633749 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:12Z","lastTransitionTime":"2025-12-04T10:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.708922 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5g27v_c6a78509-d612-4338-8562-9b0627c1793f/kube-multus/0.log" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.708986 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5g27v" event={"ID":"c6a78509-d612-4338-8562-9b0627c1793f","Type":"ContainerStarted","Data":"f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a"} Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.725698 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:49Z\\\",\\\"message\\\":\\\"e hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415770 6474 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415841 6474 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1204 10:15:49.415869 6474 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:49.415914 6474 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 10:15:49.415980 6474 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.736290 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.736325 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.736336 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.736353 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.736365 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:12Z","lastTransitionTime":"2025-12-04T10:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.737398 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.749412 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.761732 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:16:10Z\\\",\\\"message\\\":\\\"2025-12-04T10:15:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3\\\\n2025-12-04T10:15:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3 to /host/opt/cni/bin/\\\\n2025-12-04T10:15:25Z [verbose] multus-daemon started\\\\n2025-12-04T10:15:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-04T10:16:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.776284 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22
3077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.786011 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2f
fa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.798039 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e3
11683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.809895 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.821412 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.833870 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.838370 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.838428 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.838438 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.838454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.838464 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:12Z","lastTransitionTime":"2025-12-04T10:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.847267 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.858132 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15fc060-02d6-4ae3-95f5-60660050cced\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcdee1a860ddfe046b4983d9b2422cf6c993be574f549cb3667179d92944884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8d355d4394ba0854fc067550b15f70e9e72a2a1efb0d7f981574db2f81836c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://745df7e6079182e2ba297b8e268d8a1647ce0f856e99f8f90d3dc92a6a071087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.868045 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.893554 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.913393 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc 
kubenswrapper[4831]: I1204 10:16:12.933927 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.940349 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.940382 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.940391 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.940405 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.940414 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:12Z","lastTransitionTime":"2025-12-04T10:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:12 crc kubenswrapper[4831]: I1204 10:16:12.944481 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:12Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.043097 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.043140 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.043151 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.043169 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.043181 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:13Z","lastTransitionTime":"2025-12-04T10:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.145110 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.145162 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.145172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.145184 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.145193 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:13Z","lastTransitionTime":"2025-12-04T10:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.248430 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.248470 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.248480 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.248494 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.248506 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:13Z","lastTransitionTime":"2025-12-04T10:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.276428 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:13 crc kubenswrapper[4831]: E1204 10:16:13.276560 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.290742 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.302768 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.315496 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.326676 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.337811 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15fc060-02d6-4ae3-95f5-60660050cced\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcdee1a860ddfe046b4983d9b2422cf6c993be574f549cb3667179d92944884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8d355d4394ba0854fc067550b15f70e9e72a2a1efb0d7f981574db2f81836c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://745df7e6079182e2ba297b8e268d8a1647ce0f856e99f8f90d3dc92a6a071087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.347445 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.350853 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.350880 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.350889 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.350901 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.350910 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:13Z","lastTransitionTime":"2025-12-04T10:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.360528 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.371066 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.382006 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc 
kubenswrapper[4831]: I1204 10:16:13.393308 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.408641 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:16:10Z\\\",\\\"message\\\":\\\"2025-12-04T10:15:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3\\\\n2025-12-04T10:15:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3 to /host/opt/cni/bin/\\\\n2025-12-04T10:15:25Z [verbose] multus-daemon started\\\\n2025-12-04T10:15:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-04T10:16:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.426414 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:49Z\\\",\\\"message\\\":\\\"e hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415770 6474 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415841 6474 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1204 10:15:49.415869 6474 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:49.415914 6474 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 10:15:49.415980 6474 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.436214 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.446741 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.452828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.452861 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.452872 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.452889 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.452901 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:13Z","lastTransitionTime":"2025-12-04T10:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.458864 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.471248 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637
d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.482982 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2ffa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-04T10:16:13Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.554397 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.554449 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.554457 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.554470 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.554478 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:13Z","lastTransitionTime":"2025-12-04T10:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.656549 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.656745 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.656790 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.656810 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.656821 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:13Z","lastTransitionTime":"2025-12-04T10:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.759019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.759055 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.759066 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.759080 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.759092 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:13Z","lastTransitionTime":"2025-12-04T10:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.861630 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.861726 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.861738 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.861755 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.861768 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:13Z","lastTransitionTime":"2025-12-04T10:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.963986 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.964069 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.964094 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.964123 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:13 crc kubenswrapper[4831]: I1204 10:16:13.964147 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:13Z","lastTransitionTime":"2025-12-04T10:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.066954 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.067027 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.067225 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.067253 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.067264 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:14Z","lastTransitionTime":"2025-12-04T10:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.170006 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.170041 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.170052 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.170067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.170084 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:14Z","lastTransitionTime":"2025-12-04T10:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.272279 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.272311 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.272319 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.272332 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.272340 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:14Z","lastTransitionTime":"2025-12-04T10:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.275607 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:14 crc kubenswrapper[4831]: E1204 10:16:14.275797 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.275607 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:14 crc kubenswrapper[4831]: E1204 10:16:14.276010 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.275607 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:14 crc kubenswrapper[4831]: E1204 10:16:14.276193 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.374726 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.374756 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.374764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.374777 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.374785 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:14Z","lastTransitionTime":"2025-12-04T10:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.477898 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.477940 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.477950 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.477966 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.477977 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:14Z","lastTransitionTime":"2025-12-04T10:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.580294 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.580338 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.580349 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.580370 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.580383 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:14Z","lastTransitionTime":"2025-12-04T10:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.682932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.683181 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.683324 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.683436 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.683523 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:14Z","lastTransitionTime":"2025-12-04T10:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.785924 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.785967 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.785978 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.785995 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.786007 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:14Z","lastTransitionTime":"2025-12-04T10:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.889530 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.889602 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.889618 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.889647 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.889684 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:14Z","lastTransitionTime":"2025-12-04T10:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.992090 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.992137 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.992149 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.992166 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:14 crc kubenswrapper[4831]: I1204 10:16:14.992176 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:14Z","lastTransitionTime":"2025-12-04T10:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.094726 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.094764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.094772 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.094788 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.094797 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:15Z","lastTransitionTime":"2025-12-04T10:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.197021 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.197056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.197063 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.197078 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.197087 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:15Z","lastTransitionTime":"2025-12-04T10:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.275942 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:15 crc kubenswrapper[4831]: E1204 10:16:15.276085 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.299590 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.299628 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.299669 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.299684 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.299694 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:15Z","lastTransitionTime":"2025-12-04T10:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.402301 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.402343 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.402353 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.402369 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.402382 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:15Z","lastTransitionTime":"2025-12-04T10:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.505034 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.505076 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.505085 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.505101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.505123 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:15Z","lastTransitionTime":"2025-12-04T10:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.607423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.607508 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.607519 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.607534 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.607542 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:15Z","lastTransitionTime":"2025-12-04T10:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.709823 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.709883 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.709897 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.709928 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.709943 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:15Z","lastTransitionTime":"2025-12-04T10:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.812479 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.812520 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.812540 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.812563 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.812580 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:15Z","lastTransitionTime":"2025-12-04T10:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.914960 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.914991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.915002 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.915018 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:15 crc kubenswrapper[4831]: I1204 10:16:15.915029 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:15Z","lastTransitionTime":"2025-12-04T10:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.018406 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.018465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.018477 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.018498 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.018511 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:16Z","lastTransitionTime":"2025-12-04T10:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.122177 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.122244 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.122263 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.122287 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.122304 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:16Z","lastTransitionTime":"2025-12-04T10:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.225319 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.225358 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.225367 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.225381 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.225392 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:16Z","lastTransitionTime":"2025-12-04T10:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.276349 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.276449 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.276467 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:16 crc kubenswrapper[4831]: E1204 10:16:16.276499 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:16 crc kubenswrapper[4831]: E1204 10:16:16.276565 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:16 crc kubenswrapper[4831]: E1204 10:16:16.276634 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.328238 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.328274 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.328282 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.328316 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.328327 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:16Z","lastTransitionTime":"2025-12-04T10:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.430639 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.430708 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.430720 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.430752 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.430765 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:16Z","lastTransitionTime":"2025-12-04T10:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.532351 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.532395 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.532408 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.532426 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.532438 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:16Z","lastTransitionTime":"2025-12-04T10:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.635088 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.635332 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.635343 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.635358 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.635369 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:16Z","lastTransitionTime":"2025-12-04T10:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.737651 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.737763 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.737787 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.737816 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.737837 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:16Z","lastTransitionTime":"2025-12-04T10:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.840115 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.840761 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.840790 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.840819 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.840840 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:16Z","lastTransitionTime":"2025-12-04T10:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.944032 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.944068 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.944080 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.944096 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:16 crc kubenswrapper[4831]: I1204 10:16:16.944108 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:16Z","lastTransitionTime":"2025-12-04T10:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.046300 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.046351 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.046365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.046383 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.046396 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.148806 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.148842 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.148851 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.148864 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.148873 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.250862 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.250899 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.250911 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.250926 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.250939 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.276388 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:17 crc kubenswrapper[4831]: E1204 10:16:17.276726 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.288575 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.353532 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.353916 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.353930 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.353945 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.353954 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.446295 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.446341 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.446352 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.446366 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.446376 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: E1204 10:16:17.458117 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:17Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.461628 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.461683 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.461694 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.461708 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.461716 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: E1204 10:16:17.473995 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:17Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.477678 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.477724 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.477737 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.477752 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.477762 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: E1204 10:16:17.489071 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:17Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.492840 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.492867 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.492876 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.492891 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.492903 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: E1204 10:16:17.503942 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:17Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.507227 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.507267 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.507280 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.507296 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.507307 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: E1204 10:16:17.520798 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:17Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:17 crc kubenswrapper[4831]: E1204 10:16:17.520931 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.522532 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.522564 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.522572 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.522587 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.522614 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.625024 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.625072 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.625088 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.625111 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.625127 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.727315 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.727421 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.727439 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.727481 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.727500 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.830573 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.830624 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.830637 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.830654 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.830689 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.932987 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.933056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.933079 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.933099 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:17 crc kubenswrapper[4831]: I1204 10:16:17.933113 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:17Z","lastTransitionTime":"2025-12-04T10:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.035623 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.035761 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.035782 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.035806 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.035823 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:18Z","lastTransitionTime":"2025-12-04T10:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.137721 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.137755 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.137763 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.137775 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.137783 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:18Z","lastTransitionTime":"2025-12-04T10:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.240763 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.240795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.240813 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.240828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.240838 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:18Z","lastTransitionTime":"2025-12-04T10:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.275952 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.276022 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:18 crc kubenswrapper[4831]: E1204 10:16:18.276140 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.275960 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:18 crc kubenswrapper[4831]: E1204 10:16:18.276282 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:18 crc kubenswrapper[4831]: E1204 10:16:18.276382 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.343276 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.343331 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.343352 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.343378 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.343402 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:18Z","lastTransitionTime":"2025-12-04T10:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.445685 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.445723 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.445732 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.445747 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.445759 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:18Z","lastTransitionTime":"2025-12-04T10:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.547985 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.548018 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.548026 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.548039 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.548048 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:18Z","lastTransitionTime":"2025-12-04T10:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.651155 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.651979 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.652025 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.652051 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.652072 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:18Z","lastTransitionTime":"2025-12-04T10:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.755589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.755694 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.755716 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.755743 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.755761 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:18Z","lastTransitionTime":"2025-12-04T10:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.858290 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.858319 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.858326 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.858339 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.858348 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:18Z","lastTransitionTime":"2025-12-04T10:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.961011 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.961061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.961072 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.961089 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:18 crc kubenswrapper[4831]: I1204 10:16:18.961101 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:18Z","lastTransitionTime":"2025-12-04T10:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.063835 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.063903 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.063913 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.063936 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.063948 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:19Z","lastTransitionTime":"2025-12-04T10:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.167415 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.167476 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.167490 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.167514 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.167532 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:19Z","lastTransitionTime":"2025-12-04T10:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.270144 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.270225 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.270241 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.270261 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.270295 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:19Z","lastTransitionTime":"2025-12-04T10:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.275806 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:19 crc kubenswrapper[4831]: E1204 10:16:19.276031 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.372931 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.372975 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.372985 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.373003 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.373017 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:19Z","lastTransitionTime":"2025-12-04T10:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.475338 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.475376 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.475392 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.475407 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.475418 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:19Z","lastTransitionTime":"2025-12-04T10:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.577763 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.577817 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.577834 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.577855 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.577872 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:19Z","lastTransitionTime":"2025-12-04T10:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.680334 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.680382 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.680397 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.680414 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.680426 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:19Z","lastTransitionTime":"2025-12-04T10:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.784275 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.784360 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.784390 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.784423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.784448 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:19Z","lastTransitionTime":"2025-12-04T10:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.886999 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.887052 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.887067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.887091 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.887108 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:19Z","lastTransitionTime":"2025-12-04T10:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.990504 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.990562 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.990688 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.990731 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:19 crc kubenswrapper[4831]: I1204 10:16:19.990748 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:19Z","lastTransitionTime":"2025-12-04T10:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.093173 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.093221 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.093235 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.093257 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.093273 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:20Z","lastTransitionTime":"2025-12-04T10:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.196166 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.196226 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.196244 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.196266 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.196279 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:20Z","lastTransitionTime":"2025-12-04T10:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.275971 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.276071 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.275981 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:20 crc kubenswrapper[4831]: E1204 10:16:20.276103 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:20 crc kubenswrapper[4831]: E1204 10:16:20.276266 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:20 crc kubenswrapper[4831]: E1204 10:16:20.276903 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.277152 4831 scope.go:117] "RemoveContainer" containerID="32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.299333 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.299400 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.299423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.299452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.299475 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:20Z","lastTransitionTime":"2025-12-04T10:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.401860 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.401918 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.401930 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.401948 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.401961 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:20Z","lastTransitionTime":"2025-12-04T10:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.504860 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.504918 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.504930 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.504948 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.504961 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:20Z","lastTransitionTime":"2025-12-04T10:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.608110 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.608170 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.608199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.608221 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.608235 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:20Z","lastTransitionTime":"2025-12-04T10:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.711366 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.711436 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.711454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.711477 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.711503 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:20Z","lastTransitionTime":"2025-12-04T10:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.737437 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/2.log" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.813982 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.814016 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.814027 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.814043 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.814055 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:20Z","lastTransitionTime":"2025-12-04T10:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.921419 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.921455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.921476 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.921494 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:20 crc kubenswrapper[4831]: I1204 10:16:20.921507 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:20Z","lastTransitionTime":"2025-12-04T10:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.023884 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.023936 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.023953 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.023975 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.023994 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:21Z","lastTransitionTime":"2025-12-04T10:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.126473 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.126506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.126523 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.126536 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.126545 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:21Z","lastTransitionTime":"2025-12-04T10:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.228248 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.228285 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.228296 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.228314 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.228362 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:21Z","lastTransitionTime":"2025-12-04T10:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.276254 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:21 crc kubenswrapper[4831]: E1204 10:16:21.276415 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.331150 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.331191 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.331200 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.331215 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.331227 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:21Z","lastTransitionTime":"2025-12-04T10:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.433501 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.433537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.433548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.433565 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.433577 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:21Z","lastTransitionTime":"2025-12-04T10:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.536595 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.536640 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.536682 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.536704 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.536720 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:21Z","lastTransitionTime":"2025-12-04T10:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.639089 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.639122 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.639132 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.639148 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.639160 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:21Z","lastTransitionTime":"2025-12-04T10:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.742148 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.742189 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.742198 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.742213 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.742222 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:21Z","lastTransitionTime":"2025-12-04T10:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.746163 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/3.log" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.747026 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/2.log" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.749898 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" exitCode=1 Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.749944 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.749985 4831 scope.go:117] "RemoveContainer" containerID="32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.750681 4831 scope.go:117] "RemoveContainer" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" Dec 04 10:16:21 crc kubenswrapper[4831]: E1204 10:16:21.750846 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.771082 4831 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e37600f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.787336 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc 
kubenswrapper[4831]: I1204 10:16:21.809672 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.824580 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.842405 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.844496 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.844539 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.844548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.844589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.844604 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:21Z","lastTransitionTime":"2025-12-04T10:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.853240 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.868847 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.882952 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:16:10Z\\\",\\\"message\\\":\\\"2025-12-04T10:15:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3\\\\n2025-12-04T10:15:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3 to /host/opt/cni/bin/\\\\n2025-12-04T10:15:25Z [verbose] multus-daemon started\\\\n2025-12-04T10:15:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-04T10:16:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.902199 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:49Z\\\",\\\"message\\\":\\\"e hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415770 6474 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415841 6474 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1204 10:15:49.415869 6474 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:49.415914 6474 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 10:15:49.415980 6474 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:16:21Z\\\",\\\"message\\\":\\\"roller.go:776] Recording success event on pod openshift-multus/multus-5g27v\\\\nI1204 10:16:21.380417 6854 obj_retry.go:420] Function 
iterateRetryResources for *v1.Pod ended (in 3.155852ms)\\\\nI1204 10:16:21.380190 6854 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1204 10:16:21.380489 6854 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"de88cb48-af91-44f8-b3c0-73dcf8201ba5\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b
7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.912444 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2f
fa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.927161 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600fe973-1869-4a12-9354-a394ed648521\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc05b9fff58df35fc8bfab430467a16ae22396ce878934c9dd2ad0a21043f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536b44f913751976bc76295ac52a233a3de641e3fc272594a4bd94a5c788755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536b44f913751976bc76295ac52a233a3de641e3fc272594a4bd94a5c788755\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.947878 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.947924 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.947968 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.947986 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.947998 4831 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:21Z","lastTransitionTime":"2025-12-04T10:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.950471 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initC
ontainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.968041 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637
d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:21 crc kubenswrapper[4831]: I1204 10:16:21.982426 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.000834 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:21Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.014513 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:22Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.030467 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15fc060-02d6-4ae3-95f5-60660050cced\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcdee1a860ddfe046b4983d9b2422cf6c993be574f549cb3667179d92944884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8d355d4394ba0854fc067550b15f70e9e72a2a1efb0d7f981574db2f81836c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://745df7e6079182e2ba297b8e268d8a1647ce0f856e99f8f90d3dc92a6a071087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:22Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.046631 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:22Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.050148 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.050210 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.050223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.050240 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.050253 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:22Z","lastTransitionTime":"2025-12-04T10:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.153091 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.153374 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.153453 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.153526 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.153595 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:22Z","lastTransitionTime":"2025-12-04T10:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.258758 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.258937 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.258965 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.259039 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.259063 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:22Z","lastTransitionTime":"2025-12-04T10:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.276162 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.276205 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.276226 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:22 crc kubenswrapper[4831]: E1204 10:16:22.276303 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:22 crc kubenswrapper[4831]: E1204 10:16:22.276517 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:22 crc kubenswrapper[4831]: E1204 10:16:22.276656 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.361458 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.361492 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.361502 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.361518 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.361529 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:22Z","lastTransitionTime":"2025-12-04T10:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.464257 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.464293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.464303 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.464318 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.464328 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:22Z","lastTransitionTime":"2025-12-04T10:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.567357 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.567436 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.567465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.567514 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.567536 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:22Z","lastTransitionTime":"2025-12-04T10:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.670613 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.670650 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.670674 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.670686 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.670697 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:22Z","lastTransitionTime":"2025-12-04T10:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.756018 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/3.log" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.773052 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.773105 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.773120 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.773140 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.773157 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:22Z","lastTransitionTime":"2025-12-04T10:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.875591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.875632 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.875652 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.875694 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.875707 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:22Z","lastTransitionTime":"2025-12-04T10:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.977498 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.977539 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.977551 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.977567 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:22 crc kubenswrapper[4831]: I1204 10:16:22.977579 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:22Z","lastTransitionTime":"2025-12-04T10:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.080247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.080280 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.080291 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.080306 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.080317 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:23Z","lastTransitionTime":"2025-12-04T10:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.183252 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.183350 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.183367 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.183391 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.183407 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:23Z","lastTransitionTime":"2025-12-04T10:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.275969 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:23 crc kubenswrapper[4831]: E1204 10:16:23.276172 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.285981 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.286022 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.286037 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.286057 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.286070 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:23Z","lastTransitionTime":"2025-12-04T10:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.288525 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600fe973-1869-4a12-9354-a394ed648521\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc05b9fff58df35fc8bfab430467a16ae22396ce878934c9dd2ad0a21043f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536b44f913751976bc76295ac52a233a3de641e3fc272594a4bd94a5c788755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536b44f913751976bc76295ac52a233a3de641e3fc272594a4bd94a5c788755\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.304952 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e3
11683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.327940 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637
d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.342768 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2ffa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.359565 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15fc060-02d6-4ae3-95f5-60660050cced\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcdee1a860ddfe046b4983d9b2422cf6c993be574f549cb3667179d92944884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8d355d43
94ba0854fc067550b15f70e9e72a2a1efb0d7f981574db2f81836c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://745df7e6079182e2ba297b8e268d8a1647ce0f856e99f8f90d3dc92a6a071087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.374001 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.386489 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.388140 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.388172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.388186 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.388202 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.388214 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:23Z","lastTransitionTime":"2025-12-04T10:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.400524 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.416753 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.427961 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.437176 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.447386 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.456532 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.465640 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc 
kubenswrapper[4831]: I1204 10:16:23.476021 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.486535 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:16:10Z\\\",\\\"message\\\":\\\"2025-12-04T10:15:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3\\\\n2025-12-04T10:15:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3 to /host/opt/cni/bin/\\\\n2025-12-04T10:15:25Z [verbose] multus-daemon started\\\\n2025-12-04T10:15:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-04T10:16:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.489610 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.489752 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.489814 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.489892 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.489963 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:23Z","lastTransitionTime":"2025-12-04T10:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.502365 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32af2e5d1d5e83cda9e496e431c5ea7b74aa427b478bc27e3d20db69ba776ad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:15:49Z\\\",\\\"message\\\":\\\"e hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415770 6474 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 10:15:49.415841 6474 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1204 10:15:49.415869 6474 ovnkube.go:599] Stopped ovnkube\\\\nI1204 10:15:49.415914 6474 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 10:15:49.415980 6474 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:16:21Z\\\",\\\"message\\\":\\\"roller.go:776] Recording success event on pod openshift-multus/multus-5g27v\\\\nI1204 10:16:21.380417 6854 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 3.155852ms)\\\\nI1204 10:16:21.380190 6854 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per 
node) and 0 (template) load balancers\\\\nI1204 10:16:21.380489 6854 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"de88cb48-af91-44f8-b3c0-73dcf8201ba5\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b
7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.510726 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:23Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.592692 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.592907 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.593003 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.593104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.593195 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:23Z","lastTransitionTime":"2025-12-04T10:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.695199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.695863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.695919 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.695949 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.695971 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:23Z","lastTransitionTime":"2025-12-04T10:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.798092 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.798142 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.798154 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.798172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.798183 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:23Z","lastTransitionTime":"2025-12-04T10:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.900852 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.900918 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.900935 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.900960 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:23 crc kubenswrapper[4831]: I1204 10:16:23.900975 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:23Z","lastTransitionTime":"2025-12-04T10:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.004344 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.004682 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.004700 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.004718 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.004730 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:24Z","lastTransitionTime":"2025-12-04T10:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.107454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.107493 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.107507 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.107524 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.107538 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:24Z","lastTransitionTime":"2025-12-04T10:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.210179 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.210256 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.210281 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.210307 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.210326 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:24Z","lastTransitionTime":"2025-12-04T10:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.276190 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.276218 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:24 crc kubenswrapper[4831]: E1204 10:16:24.276336 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.276211 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:24 crc kubenswrapper[4831]: E1204 10:16:24.276522 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:24 crc kubenswrapper[4831]: E1204 10:16:24.276565 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.313091 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.313125 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.313133 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.313146 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.313155 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:24Z","lastTransitionTime":"2025-12-04T10:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.415806 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.415847 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.415859 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.415871 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.415881 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:24Z","lastTransitionTime":"2025-12-04T10:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.518970 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.519039 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.519056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.519077 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.519093 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:24Z","lastTransitionTime":"2025-12-04T10:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.622125 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.622174 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.622185 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.622202 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.622213 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:24Z","lastTransitionTime":"2025-12-04T10:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.725134 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.725197 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.725208 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.725223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.725232 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:24Z","lastTransitionTime":"2025-12-04T10:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.827484 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.827521 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.827530 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.827544 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.827554 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:24Z","lastTransitionTime":"2025-12-04T10:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.930184 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.930602 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.930619 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.930645 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:24 crc kubenswrapper[4831]: I1204 10:16:24.930696 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:24Z","lastTransitionTime":"2025-12-04T10:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.034352 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.034429 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.034456 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.034503 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.034525 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:25Z","lastTransitionTime":"2025-12-04T10:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.137423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.137475 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.137487 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.137506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.137519 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:25Z","lastTransitionTime":"2025-12-04T10:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.240028 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.240343 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.240444 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.240542 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.240680 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:25Z","lastTransitionTime":"2025-12-04T10:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.277052 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:25 crc kubenswrapper[4831]: E1204 10:16:25.277240 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.342980 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.343016 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.343026 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.343042 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.343052 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:25Z","lastTransitionTime":"2025-12-04T10:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.445920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.445991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.446014 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.446044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.446102 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:25Z","lastTransitionTime":"2025-12-04T10:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.549029 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.549086 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.549097 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.549115 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.549128 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:25Z","lastTransitionTime":"2025-12-04T10:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.651330 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.651379 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.651399 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.651418 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.651431 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:25Z","lastTransitionTime":"2025-12-04T10:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.753853 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.753898 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.753946 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.753961 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.754114 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:25Z","lastTransitionTime":"2025-12-04T10:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.856970 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.857012 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.857028 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.857055 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.857066 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:25Z","lastTransitionTime":"2025-12-04T10:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.959900 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.960167 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.960242 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.960366 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:25 crc kubenswrapper[4831]: I1204 10:16:25.960444 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:25Z","lastTransitionTime":"2025-12-04T10:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.063340 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.063395 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.063411 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.063433 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.063448 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:26Z","lastTransitionTime":"2025-12-04T10:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.143063 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.144092 4831 scope.go:117] "RemoveContainer" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.144342 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.157645 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.160343 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.160535 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:30.160514868 +0000 UTC m=+147.109690212 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.166606 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.166644 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.166679 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.166697 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.166711 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:26Z","lastTransitionTime":"2025-12-04T10:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.176772 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:16:10Z\\\",\\\"message\\\":\\\"2025-12-04T10:15:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3\\\\n2025-12-04T10:15:25+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3 to /host/opt/cni/bin/\\\\n2025-12-04T10:15:25Z [verbose] multus-daemon started\\\\n2025-12-04T10:15:25Z [verbose] Readiness Indicator file check\\\\n2025-12-04T10:16:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.200615 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:16:21Z\\\",\\\"message\\\":\\\"roller.go:776] Recording success event on pod openshift-multus/multus-5g27v\\\\nI1204 
10:16:21.380417 6854 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 3.155852ms)\\\\nI1204 10:16:21.380190 6854 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1204 10:16:21.380489 6854 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"de88cb48-af91-44f8-b3c0-73dcf8201ba5\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:16:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.218333 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.230462 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600fe973-1869-4a12-9354-a394ed648521\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc05b9fff58df35fc8bfab430467a16ae22396ce878934c9dd2ad0a21043f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536b44f913751976bc76295ac52a233a3de641e3fc272594a4bd94a5c788755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536b44f913751976bc76295ac52a233a3de641e3fc272594a4bd94a5c788755\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc 
kubenswrapper[4831]: I1204 10:16:26.244794 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ff
d995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 
10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.258020 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637
d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.260981 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.261186 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.261328 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.261490 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.261234 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.262286 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.261786 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.261953 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.262541 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.262612 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.262208 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.262372 4831 projected.go:194] Error preparing 
data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.262495 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:17:30.262439778 +0000 UTC m=+147.211615112 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.262926 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 10:17:30.262898051 +0000 UTC m=+147.212073385 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.262946 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 10:17:30.262938312 +0000 UTC m=+147.212113636 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.262963 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 10:17:30.262956343 +0000 UTC m=+147.212131677 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.269726 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.269934 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.270035 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.270186 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.270271 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2f
fa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.270422 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:26Z","lastTransitionTime":"2025-12-04T10:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.276296 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.276410 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.276516 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.276311 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.276710 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:26 crc kubenswrapper[4831]: E1204 10:16:26.276967 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.283913 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.295851 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15fc060-02d6-4ae3-95f5-60660050cced\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcdee1a860ddfe046b4983d9b2422cf6c993be574f549cb3667179d92944884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8d355d4394ba0854fc067550b15f70e9e72a2a1efb0d7f981574db2f81836c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://745df7e6079182e2ba297b8e268d8a1647ce0f856e99f8f90d3dc92a6a071087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.307033 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.321984 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.336115 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.348391 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.358000 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.369344 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.372465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.372495 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.372532 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.372551 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.372562 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:26Z","lastTransitionTime":"2025-12-04T10:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.379502 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e37600f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.389012 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:26Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:26 crc 
kubenswrapper[4831]: I1204 10:16:26.475304 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.475358 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.475371 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.475387 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.475395 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:26Z","lastTransitionTime":"2025-12-04T10:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.578084 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.578582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.578857 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.579044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.579249 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:26Z","lastTransitionTime":"2025-12-04T10:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.682293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.682331 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.682343 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.682358 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.682369 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:26Z","lastTransitionTime":"2025-12-04T10:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.785025 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.785074 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.785086 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.785102 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.785113 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:26Z","lastTransitionTime":"2025-12-04T10:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.887458 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.887496 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.887506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.887521 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.887530 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:26Z","lastTransitionTime":"2025-12-04T10:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.989778 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.989819 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.989829 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.989845 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:26 crc kubenswrapper[4831]: I1204 10:16:26.989856 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:26Z","lastTransitionTime":"2025-12-04T10:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.092582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.092650 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.092674 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.092692 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.092704 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.195456 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.195503 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.195513 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.195530 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.195541 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.276247 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:27 crc kubenswrapper[4831]: E1204 10:16:27.276817 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.297784 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.297819 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.297827 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.297842 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.297852 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.400642 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.400715 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.400727 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.400746 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.400758 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.503114 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.503171 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.503188 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.503215 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.503232 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.605549 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.605611 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.605624 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.605641 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.605707 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.645489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.645533 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.645541 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.645556 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.645575 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: E1204 10:16:27.662507 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.666825 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.666857 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.666885 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.666899 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.666908 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: E1204 10:16:27.684343 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.688486 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.688518 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.688528 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.688545 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.688559 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: E1204 10:16:27.701047 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.704157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.704191 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.704198 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.704214 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.704224 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: E1204 10:16:27.715465 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.722324 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.722366 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.722376 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.722392 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.722406 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: E1204 10:16:27.736903 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae8c7b9f-5f4a-44bc-a820-5600f29471a7\\\",\\\"systemUUID\\\":\\\"aaf9904b-e604-46a1-bdf5-7d2b7b9a992c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:27Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:27 crc kubenswrapper[4831]: E1204 10:16:27.737069 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.738939 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.738969 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.739001 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.739019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.739030 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.841389 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.841453 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.841470 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.841493 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.841513 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.944975 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.945018 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.945029 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.945047 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:27 crc kubenswrapper[4831]: I1204 10:16:27.945058 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:27Z","lastTransitionTime":"2025-12-04T10:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.048304 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.048374 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.048384 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.048418 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.048428 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:28Z","lastTransitionTime":"2025-12-04T10:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.151140 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.151187 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.151199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.151221 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.151234 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:28Z","lastTransitionTime":"2025-12-04T10:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.254207 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.254250 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.254260 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.254279 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.254290 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:28Z","lastTransitionTime":"2025-12-04T10:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.275790 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.275829 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.275844 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:28 crc kubenswrapper[4831]: E1204 10:16:28.275951 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:28 crc kubenswrapper[4831]: E1204 10:16:28.276053 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:28 crc kubenswrapper[4831]: E1204 10:16:28.276165 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.357860 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.357946 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.357964 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.357993 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.358017 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:28Z","lastTransitionTime":"2025-12-04T10:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.460375 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.460441 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.460452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.460474 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.460489 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:28Z","lastTransitionTime":"2025-12-04T10:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.563152 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.563191 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.563200 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.563216 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.563226 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:28Z","lastTransitionTime":"2025-12-04T10:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.666030 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.666100 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.666114 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.666130 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.666142 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:28Z","lastTransitionTime":"2025-12-04T10:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.768507 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.768578 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.768591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.768607 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.768617 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:28Z","lastTransitionTime":"2025-12-04T10:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.872404 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.872468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.872484 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.872507 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.872525 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:28Z","lastTransitionTime":"2025-12-04T10:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.974526 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.974553 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.974563 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.974597 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:28 crc kubenswrapper[4831]: I1204 10:16:28.974607 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:28Z","lastTransitionTime":"2025-12-04T10:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.077094 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.077134 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.077142 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.077156 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.077182 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:29Z","lastTransitionTime":"2025-12-04T10:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.179868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.179931 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.179944 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.179960 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.179971 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:29Z","lastTransitionTime":"2025-12-04T10:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.275686 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:29 crc kubenswrapper[4831]: E1204 10:16:29.275899 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.281502 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.281558 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.281577 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.281598 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.281617 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:29Z","lastTransitionTime":"2025-12-04T10:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.384332 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.384378 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.384420 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.384439 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.384449 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:29Z","lastTransitionTime":"2025-12-04T10:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.487110 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.487153 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.487164 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.487178 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.487188 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:29Z","lastTransitionTime":"2025-12-04T10:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.589086 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.589155 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.589172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.589187 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.589197 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:29Z","lastTransitionTime":"2025-12-04T10:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.691237 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.691264 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.691275 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.691288 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.691296 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:29Z","lastTransitionTime":"2025-12-04T10:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.793796 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.793838 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.793849 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.793867 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.793879 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:29Z","lastTransitionTime":"2025-12-04T10:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.897453 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.897564 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.897586 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.897783 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:29 crc kubenswrapper[4831]: I1204 10:16:29.897811 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:29Z","lastTransitionTime":"2025-12-04T10:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.000713 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.000781 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.000792 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.000810 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.000844 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:30Z","lastTransitionTime":"2025-12-04T10:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.102915 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.102954 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.102966 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.102982 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.102994 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:30Z","lastTransitionTime":"2025-12-04T10:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.205703 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.205765 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.205786 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.205815 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.205837 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:30Z","lastTransitionTime":"2025-12-04T10:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.275902 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.276021 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:30 crc kubenswrapper[4831]: E1204 10:16:30.276084 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:30 crc kubenswrapper[4831]: E1204 10:16:30.276290 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.275916 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:30 crc kubenswrapper[4831]: E1204 10:16:30.276478 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.309123 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.309161 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.309171 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.309206 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.309216 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:30Z","lastTransitionTime":"2025-12-04T10:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.411881 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.411926 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.411937 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.411952 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.411965 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:30Z","lastTransitionTime":"2025-12-04T10:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.514714 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.514752 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.514764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.514780 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.514792 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:30Z","lastTransitionTime":"2025-12-04T10:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.617821 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.617880 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.617897 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.617920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.617939 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:30Z","lastTransitionTime":"2025-12-04T10:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.720939 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.720988 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.721004 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.721028 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.721044 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:30Z","lastTransitionTime":"2025-12-04T10:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.823460 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.823517 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.823534 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.823557 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.823577 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:30Z","lastTransitionTime":"2025-12-04T10:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.926541 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.926567 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.926574 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.926586 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:30 crc kubenswrapper[4831]: I1204 10:16:30.926595 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:30Z","lastTransitionTime":"2025-12-04T10:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.030293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.030330 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.030341 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.030357 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.030369 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:31Z","lastTransitionTime":"2025-12-04T10:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.132911 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.132986 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.133005 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.133029 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.133047 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:31Z","lastTransitionTime":"2025-12-04T10:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.235451 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.235481 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.235492 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.235508 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.235519 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:31Z","lastTransitionTime":"2025-12-04T10:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.275654 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:31 crc kubenswrapper[4831]: E1204 10:16:31.275879 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.337603 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.337632 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.337641 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.337654 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.337685 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:31Z","lastTransitionTime":"2025-12-04T10:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.440274 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.440303 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.440311 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.440325 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.440333 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:31Z","lastTransitionTime":"2025-12-04T10:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.542850 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.542896 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.542907 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.542923 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.542934 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:31Z","lastTransitionTime":"2025-12-04T10:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.646098 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.646157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.646173 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.646199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.646215 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:31Z","lastTransitionTime":"2025-12-04T10:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.748248 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.748299 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.748311 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.748328 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.748340 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:31Z","lastTransitionTime":"2025-12-04T10:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.851070 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.851109 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.851119 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.851135 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.851145 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:31Z","lastTransitionTime":"2025-12-04T10:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.953372 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.954128 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.954292 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.954395 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:31 crc kubenswrapper[4831]: I1204 10:16:31.954488 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:31Z","lastTransitionTime":"2025-12-04T10:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.057270 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.057310 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.057321 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.057339 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.057351 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:32Z","lastTransitionTime":"2025-12-04T10:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.160848 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.160904 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.160935 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.160951 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.160963 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:32Z","lastTransitionTime":"2025-12-04T10:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.263648 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.263696 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.263706 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.263722 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.263732 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:32Z","lastTransitionTime":"2025-12-04T10:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.275537 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.275537 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:32 crc kubenswrapper[4831]: E1204 10:16:32.275682 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:32 crc kubenswrapper[4831]: E1204 10:16:32.275758 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.275558 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:32 crc kubenswrapper[4831]: E1204 10:16:32.275859 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.366444 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.366654 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.366752 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.366818 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.366876 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:32Z","lastTransitionTime":"2025-12-04T10:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.469448 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.469771 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.469869 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.469945 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.470001 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:32Z","lastTransitionTime":"2025-12-04T10:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.572356 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.572664 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.572933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.573153 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.573302 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:32Z","lastTransitionTime":"2025-12-04T10:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.676805 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.676845 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.676859 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.676874 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.676883 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:32Z","lastTransitionTime":"2025-12-04T10:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.780324 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.780380 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.780400 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.780427 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.780444 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:32Z","lastTransitionTime":"2025-12-04T10:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.883589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.883648 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.883705 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.883727 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.883739 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:32Z","lastTransitionTime":"2025-12-04T10:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.986889 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.987384 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.987422 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.987458 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:32 crc kubenswrapper[4831]: I1204 10:16:32.987482 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:32Z","lastTransitionTime":"2025-12-04T10:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.089831 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.089893 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.089910 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.089935 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.089951 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:33Z","lastTransitionTime":"2025-12-04T10:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.193013 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.193427 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.193577 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.193768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.193949 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:33Z","lastTransitionTime":"2025-12-04T10:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.276353 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:33 crc kubenswrapper[4831]: E1204 10:16:33.276602 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.296953 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.297000 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.297011 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.297028 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.297039 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:33Z","lastTransitionTime":"2025-12-04T10:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.298874 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9ft9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3938322-cab2-412a-91e4-904ce2d99adf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a45d7846aee5ad8d18b94a52fb2d4e99365bc0a6c5b774d33b26e7a107fc825d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4l7pt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9ft9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.313587 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.334972 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5g27v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6a78509-d612-4338-8562-9b0627c1793f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:16:10Z\\\",\\\"message\\\":\\\"2025-12-04T10:15:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3\\\\n2025-12-04T10:15:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fa2cef0a-01d4-4a99-8b70-548df287b1a3 to /host/opt/cni/bin/\\\\n2025-12-04T10:15:25Z [verbose] multus-daemon started\\\\n2025-12-04T10:15:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-04T10:16:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5g27v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.356153 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T10:16:21Z\\\",\\\"message\\\":\\\"roller.go:776] Recording success event on pod openshift-multus/multus-5g27v\\\\nI1204 
10:16:21.380417 6854 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 3.155852ms)\\\\nI1204 10:16:21.380190 6854 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1204 10:16:21.380489 6854 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"de88cb48-af91-44f8-b3c0-73dcf8201ba5\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:16:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4da91614a1ff61a3
846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb7xl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xzkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.368923 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca0a26ee-8553-41fc-8723-935bd994e3dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87e3574c160272839c8c85471f3e37b92c4bfaac028513cb345db9c804f73ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c389f190e572a728addfeaa6cb3bac8ece2f
fa00fd451e584b5bcf4965256fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmwqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ttr4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.378562 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600fe973-1869-4a12-9354-a394ed648521\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bc05b9fff58df35fc8bfab430467a16ae22396ce878934c9dd2ad0a21043f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536b44f913751976bc76295ac52a233a3de641e3fc272594a4bd94a5c788755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536b44f913751976bc76295ac52a233a3de641e3fc272594a4bd94a5c788755\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.391793 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7266f967-3803-4ef3-9609-5a9c540a8305\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T10:15:22Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 10:15:16.683634 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 10:15:16.685267 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-316787488/tls.crt::/tmp/serving-cert-316787488/tls.key\\\\\\\"\\\\nI1204 10:15:22.063722 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 10:15:22.066433 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 10:15:22.066471 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 10:15:22.066492 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 10:15:22.066499 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 10:15:22.070852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 10:15:22.070873 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070879 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 10:15:22.070885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 10:15:22.070889 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 10:15:22.070892 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI1204 10:15:22.070891 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 10:15:22.070895 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 10:15:22.073753 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa023ec56db40530ba6cf6c9d5f98e3
11683165cc8bc397e6f8de7442cff43a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.399386 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.399646 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.399750 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.399818 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.399880 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:33Z","lastTransitionTime":"2025-12-04T10:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.407814 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4f6422d-d5d2-4e56-8f87-84846b4b98eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c223077ddfa1e5cb3f1efbb02dcac1239c34d2337f398635019c8fe6e4509097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dba2f4f3bb62a9f81ffcc64a58e31659ffbc8c9baac1ffd78e0619250472dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://aa336464b730e799f5b76fcadf827833b0bbe5f89d5b40894f9f98be868107bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://455b945a4ae8a6983d99c4a31fb0cf08ff0b845ec4f943f4e5aef6a46ff78dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63637d69c4e8f00282d8845aca3ffffa20071f17d6c0d03100ac026a7310ab38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca54389ba22ff227fa5b2691f0624466f23b6156217037deb0f04df33a7a42d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13cd70850618a98788d7713e79046e758411d6b165da3b73c0e29b46688cd38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5j6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zk2rt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.423343 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5814902f9f02bed9eb4600d77993f20ad463f2f78064f66755641f509913f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:
22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d1e9dabb13deee46d2c9ce72c90d2ed09c3e293b11b987c0c4be5b9a3cebe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.435060 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3fade3c6b87d8b8d465b4e59ca3685265f09365b19feab5f4d31c18fecdea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.445405 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc448500592bc03a05d95154710642daf9362093c92bed1334c84b3f23ebf556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.454541 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15fc060-02d6-4ae3-95f5-60660050cced\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcdee1a860ddfe046b4983d9b2422cf6c993be574f549cb3667179d92944884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa8d355d4394ba0854fc067550b15f70e9e72a2a1efb0d7f981574db2f81836c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://745df7e6079182e2ba297b8e268d8a1647ce0f856e99f8f90d3dc92a6a071087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cert
s\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae20d8a14fc0db716ccf1450078acdb98f14e5522b17a00b243c67300d7e0b43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T10:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.463994 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.473524 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8475bb26-8864-4d49-935b-db7d4cb73387\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6bf0cca6fa6a2c105fe80613113f4e04f5252d70eb5de23b53632ba8479e070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399a139e421589fc50e4bad256402be123e3760
0f5c39eadf919ce767e9eeeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mt7gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-g76nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.482468 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fd6cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc 
kubenswrapper[4831]: I1204 10:16:33.491726 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8a90f1a-4f3b-4007-8635-52d0af689af0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3ec5dcde986c3e369253eb2c4fc8d6e646c9c1557bc5c28a98a6c71562d04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6c54394c01d6097ec29247b0676fc04989db88066d482eb053b5e9b040b630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://941587f1cf00e0b2279e8f08f1bad8e477f14a02f6cfc8ed858d0196e01f29f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.500006 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xc5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a02957-2de3-4874-b43e-85be9e748dab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c94431d54539f9725aa9b32ac709bc9406dd0f1a1f82c3de77b3fefe7a34e3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T10:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l96vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T10:15:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xc5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.501314 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.501338 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.501373 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.501394 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.501409 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:33Z","lastTransitionTime":"2025-12-04T10:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.510595 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T10:15:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T10:16:33Z is after 2025-08-24T17:21:41Z" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.603932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.603962 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.603970 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.603982 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.603990 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:33Z","lastTransitionTime":"2025-12-04T10:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.706016 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.706082 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.706104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.706137 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.706165 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:33Z","lastTransitionTime":"2025-12-04T10:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.809400 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.809446 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.809461 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.809480 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.809497 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:33Z","lastTransitionTime":"2025-12-04T10:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.911403 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.911448 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.911459 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.911475 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:33 crc kubenswrapper[4831]: I1204 10:16:33.911487 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:33Z","lastTransitionTime":"2025-12-04T10:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.013787 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.013832 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.013843 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.013861 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.013873 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:34Z","lastTransitionTime":"2025-12-04T10:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.115962 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.116006 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.116014 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.116027 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.116035 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:34Z","lastTransitionTime":"2025-12-04T10:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.218129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.218175 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.218187 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.218205 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.218217 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:34Z","lastTransitionTime":"2025-12-04T10:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.276278 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.276323 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.276390 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:34 crc kubenswrapper[4831]: E1204 10:16:34.276433 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:34 crc kubenswrapper[4831]: E1204 10:16:34.276504 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:34 crc kubenswrapper[4831]: E1204 10:16:34.276604 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.320371 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.320421 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.320432 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.320451 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.320462 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:34Z","lastTransitionTime":"2025-12-04T10:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.423802 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.423832 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.423841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.423854 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.423863 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:34Z","lastTransitionTime":"2025-12-04T10:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.531391 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.531773 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.531904 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.532040 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.532156 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:34Z","lastTransitionTime":"2025-12-04T10:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.635862 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.636149 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.636324 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.636459 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.636590 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:34Z","lastTransitionTime":"2025-12-04T10:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.739567 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.739593 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.739601 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.739613 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.739622 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:34Z","lastTransitionTime":"2025-12-04T10:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.843193 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.843285 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.843309 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.843341 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.843361 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:34Z","lastTransitionTime":"2025-12-04T10:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.946780 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.946828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.946837 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.946853 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:34 crc kubenswrapper[4831]: I1204 10:16:34.946864 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:34Z","lastTransitionTime":"2025-12-04T10:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.049994 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.050047 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.050063 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.050088 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.050104 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:35Z","lastTransitionTime":"2025-12-04T10:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.152439 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.152480 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.152490 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.152504 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.152515 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:35Z","lastTransitionTime":"2025-12-04T10:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.255411 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.255455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.255467 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.255484 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.255495 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:35Z","lastTransitionTime":"2025-12-04T10:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.276198 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:35 crc kubenswrapper[4831]: E1204 10:16:35.276332 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.357983 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.358230 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.358322 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.358426 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.358500 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:35Z","lastTransitionTime":"2025-12-04T10:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.461076 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.461119 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.461130 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.461172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.461184 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:35Z","lastTransitionTime":"2025-12-04T10:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.563115 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.563148 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.563158 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.563171 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.563181 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:35Z","lastTransitionTime":"2025-12-04T10:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.665997 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.666058 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.666067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.666081 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.666090 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:35Z","lastTransitionTime":"2025-12-04T10:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.768373 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.768413 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.768425 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.768440 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.768451 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:35Z","lastTransitionTime":"2025-12-04T10:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.870718 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.870743 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.870751 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.870764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.870772 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:35Z","lastTransitionTime":"2025-12-04T10:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.973031 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.973235 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.973308 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.973370 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:35 crc kubenswrapper[4831]: I1204 10:16:35.973429 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:35Z","lastTransitionTime":"2025-12-04T10:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.076396 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.076999 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.077123 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.077242 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.077358 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:36Z","lastTransitionTime":"2025-12-04T10:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.180883 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.180939 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.180955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.180976 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.180992 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:36Z","lastTransitionTime":"2025-12-04T10:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.275773 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.276128 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.275916 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:36 crc kubenswrapper[4831]: E1204 10:16:36.276313 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:36 crc kubenswrapper[4831]: E1204 10:16:36.276373 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:36 crc kubenswrapper[4831]: E1204 10:16:36.276482 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.283835 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.283904 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.283931 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.283959 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.283981 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:36Z","lastTransitionTime":"2025-12-04T10:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.387779 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.388069 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.388084 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.388106 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.388121 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:36Z","lastTransitionTime":"2025-12-04T10:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.490639 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.490922 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.491053 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.491167 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.491274 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:36Z","lastTransitionTime":"2025-12-04T10:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.594760 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.594801 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.594812 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.594830 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.594841 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:36Z","lastTransitionTime":"2025-12-04T10:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.697242 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.697551 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.697748 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.697879 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.697983 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:36Z","lastTransitionTime":"2025-12-04T10:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.806600 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.806641 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.806654 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.806690 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.806706 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:36Z","lastTransitionTime":"2025-12-04T10:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.909589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.909652 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.909713 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.909744 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:36 crc kubenswrapper[4831]: I1204 10:16:36.909763 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:36Z","lastTransitionTime":"2025-12-04T10:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.011955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.011996 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.012009 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.012024 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.012049 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:37Z","lastTransitionTime":"2025-12-04T10:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.114801 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.114845 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.114853 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.114871 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.114881 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:37Z","lastTransitionTime":"2025-12-04T10:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.219316 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.219395 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.219418 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.219449 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.219471 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:37Z","lastTransitionTime":"2025-12-04T10:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.276294 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:37 crc kubenswrapper[4831]: E1204 10:16:37.276457 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.321778 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.321834 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.321849 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.321868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.321884 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:37Z","lastTransitionTime":"2025-12-04T10:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.424983 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.425045 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.425066 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.425092 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.425111 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:37Z","lastTransitionTime":"2025-12-04T10:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.527947 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.527983 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.527994 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.528010 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.528020 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:37Z","lastTransitionTime":"2025-12-04T10:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.630419 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.630495 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.630511 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.630532 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.630546 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:37Z","lastTransitionTime":"2025-12-04T10:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.733124 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.733171 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.733179 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.733194 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.733203 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:37Z","lastTransitionTime":"2025-12-04T10:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.744706 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.744766 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.744785 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.744805 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.744817 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T10:16:37Z","lastTransitionTime":"2025-12-04T10:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.804526 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97"] Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.804946 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.806997 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.807040 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.807366 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.808397 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.880439 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5g27v" podStartSLOduration=74.880420633 podStartE2EDuration="1m14.880420633s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:16:37.846492658 +0000 UTC m=+94.795667962" watchObservedRunningTime="2025-12-04 10:16:37.880420633 +0000 UTC m=+94.829595947" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.885501 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/66c256de-3eb9-4fca-be55-4416cc6a6a46-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.885548 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66c256de-3eb9-4fca-be55-4416cc6a6a46-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.885579 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66c256de-3eb9-4fca-be55-4416cc6a6a46-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.885748 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c256de-3eb9-4fca-be55-4416cc6a6a46-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.885832 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/66c256de-3eb9-4fca-be55-4416cc6a6a46-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.901110 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9ft9l" podStartSLOduration=74.901087993 podStartE2EDuration="1m14.901087993s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:16:37.890536427 +0000 UTC m=+94.839711761" watchObservedRunningTime="2025-12-04 10:16:37.901087993 +0000 UTC m=+94.850263317" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.909744 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=20.909723334 podStartE2EDuration="20.909723334s" podCreationTimestamp="2025-12-04 10:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:16:37.909675133 +0000 UTC m=+94.858850447" watchObservedRunningTime="2025-12-04 10:16:37.909723334 +0000 UTC m=+94.858898688" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.925503 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=75.925481432 podStartE2EDuration="1m15.925481432s" podCreationTimestamp="2025-12-04 10:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:16:37.924462852 +0000 UTC m=+94.873638176" watchObservedRunningTime="2025-12-04 10:16:37.925481432 +0000 UTC m=+94.874656786" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.945034 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zk2rt" podStartSLOduration=74.945018479 podStartE2EDuration="1m14.945018479s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:16:37.944968048 +0000 UTC m=+94.894143372" watchObservedRunningTime="2025-12-04 10:16:37.945018479 +0000 UTC 
m=+94.894193783" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.959237 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ttr4t" podStartSLOduration=73.959217641 podStartE2EDuration="1m13.959217641s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:16:37.958693266 +0000 UTC m=+94.907868620" watchObservedRunningTime="2025-12-04 10:16:37.959217641 +0000 UTC m=+94.908392955" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.974206 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.974187116 podStartE2EDuration="38.974187116s" podCreationTimestamp="2025-12-04 10:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:16:37.973696962 +0000 UTC m=+94.922872366" watchObservedRunningTime="2025-12-04 10:16:37.974187116 +0000 UTC m=+94.923362430" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.986974 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66c256de-3eb9-4fca-be55-4416cc6a6a46-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.987034 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c256de-3eb9-4fca-be55-4416cc6a6a46-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.987059 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/66c256de-3eb9-4fca-be55-4416cc6a6a46-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.987087 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/66c256de-3eb9-4fca-be55-4416cc6a6a46-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.987101 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66c256de-3eb9-4fca-be55-4416cc6a6a46-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.987203 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/66c256de-3eb9-4fca-be55-4416cc6a6a46-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.987203 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/66c256de-3eb9-4fca-be55-4416cc6a6a46-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.987872 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66c256de-3eb9-4fca-be55-4416cc6a6a46-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:37 crc kubenswrapper[4831]: I1204 10:16:37.996861 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c256de-3eb9-4fca-be55-4416cc6a6a46-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.017200 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66c256de-3eb9-4fca-be55-4416cc6a6a46-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tbc97\" (UID: \"66c256de-3eb9-4fca-be55-4416cc6a6a46\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.058993 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.058973628 podStartE2EDuration="1m11.058973628s" podCreationTimestamp="2025-12-04 10:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:16:38.057906297 +0000 
UTC m=+95.007081611" watchObservedRunningTime="2025-12-04 10:16:38.058973628 +0000 UTC m=+95.008148952" Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.069218 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xc5vd" podStartSLOduration=77.069195645 podStartE2EDuration="1m17.069195645s" podCreationTimestamp="2025-12-04 10:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:16:38.067946139 +0000 UTC m=+95.017121473" watchObservedRunningTime="2025-12-04 10:16:38.069195645 +0000 UTC m=+95.018370969" Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.092801 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podStartSLOduration=75.09278272 podStartE2EDuration="1m15.09278272s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:16:38.09242346 +0000 UTC m=+95.041598784" watchObservedRunningTime="2025-12-04 10:16:38.09278272 +0000 UTC m=+95.041958024" Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.119112 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.275808 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.276139 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:38 crc kubenswrapper[4831]: E1204 10:16:38.276254 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.276805 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:38 crc kubenswrapper[4831]: E1204 10:16:38.276909 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:38 crc kubenswrapper[4831]: E1204 10:16:38.276968 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.277741 4831 scope.go:117] "RemoveContainer" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" Dec 04 10:16:38 crc kubenswrapper[4831]: E1204 10:16:38.277909 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.290747 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.816730 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" event={"ID":"66c256de-3eb9-4fca-be55-4416cc6a6a46","Type":"ContainerStarted","Data":"e5aeeed55a0a27000cf336469e0147341caaf18d12f8933bca206c6906b761aa"} Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.816802 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" event={"ID":"66c256de-3eb9-4fca-be55-4416cc6a6a46","Type":"ContainerStarted","Data":"32c18be79336be28d3d9ce0cd810fe3ae43f98aa5b1999925770ee0c02f3f390"} Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.845207 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=0.845185399 podStartE2EDuration="845.185399ms" podCreationTimestamp="2025-12-04 10:16:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 10:16:38.843638504 +0000 UTC m=+95.792813858" watchObservedRunningTime="2025-12-04 10:16:38.845185399 +0000 UTC m=+95.794360733" Dec 04 10:16:38 crc kubenswrapper[4831]: I1204 10:16:38.858257 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tbc97" podStartSLOduration=75.858237158 podStartE2EDuration="1m15.858237158s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:16:38.856208769 +0000 UTC m=+95.805384093" watchObservedRunningTime="2025-12-04 10:16:38.858237158 +0000 UTC m=+95.807412472" Dec 04 10:16:39 crc kubenswrapper[4831]: I1204 10:16:39.276065 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:39 crc kubenswrapper[4831]: E1204 10:16:39.276210 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:40 crc kubenswrapper[4831]: I1204 10:16:40.276101 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:40 crc kubenswrapper[4831]: I1204 10:16:40.276171 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:40 crc kubenswrapper[4831]: I1204 10:16:40.276123 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:40 crc kubenswrapper[4831]: E1204 10:16:40.276292 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:40 crc kubenswrapper[4831]: E1204 10:16:40.276439 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:40 crc kubenswrapper[4831]: E1204 10:16:40.276589 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:41 crc kubenswrapper[4831]: I1204 10:16:41.275458 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:41 crc kubenswrapper[4831]: E1204 10:16:41.275648 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:42 crc kubenswrapper[4831]: I1204 10:16:42.027931 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:42 crc kubenswrapper[4831]: E1204 10:16:42.028205 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:16:42 crc kubenswrapper[4831]: E1204 10:16:42.029007 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs podName:5b50ce71-ca0a-4532-86e9-4f779dcc7b93 nodeName:}" failed. No retries permitted until 2025-12-04 10:17:46.02897968 +0000 UTC m=+162.978155074 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs") pod "network-metrics-daemon-fd6cw" (UID: "5b50ce71-ca0a-4532-86e9-4f779dcc7b93") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 10:16:42 crc kubenswrapper[4831]: I1204 10:16:42.275884 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:42 crc kubenswrapper[4831]: I1204 10:16:42.275960 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:42 crc kubenswrapper[4831]: E1204 10:16:42.276084 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:42 crc kubenswrapper[4831]: I1204 10:16:42.276111 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:42 crc kubenswrapper[4831]: E1204 10:16:42.276280 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:42 crc kubenswrapper[4831]: E1204 10:16:42.276468 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:43 crc kubenswrapper[4831]: I1204 10:16:43.276151 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:43 crc kubenswrapper[4831]: E1204 10:16:43.277878 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:44 crc kubenswrapper[4831]: I1204 10:16:44.276055 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:44 crc kubenswrapper[4831]: I1204 10:16:44.276094 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:44 crc kubenswrapper[4831]: I1204 10:16:44.276069 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:44 crc kubenswrapper[4831]: E1204 10:16:44.276196 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:44 crc kubenswrapper[4831]: E1204 10:16:44.276342 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:44 crc kubenswrapper[4831]: E1204 10:16:44.276393 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:45 crc kubenswrapper[4831]: I1204 10:16:45.276176 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:45 crc kubenswrapper[4831]: E1204 10:16:45.276396 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:46 crc kubenswrapper[4831]: I1204 10:16:46.275644 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:46 crc kubenswrapper[4831]: I1204 10:16:46.275782 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:46 crc kubenswrapper[4831]: E1204 10:16:46.275947 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:46 crc kubenswrapper[4831]: I1204 10:16:46.276037 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:46 crc kubenswrapper[4831]: E1204 10:16:46.276198 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:46 crc kubenswrapper[4831]: E1204 10:16:46.276384 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:47 crc kubenswrapper[4831]: I1204 10:16:47.276321 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:47 crc kubenswrapper[4831]: E1204 10:16:47.276531 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:48 crc kubenswrapper[4831]: I1204 10:16:48.276034 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:48 crc kubenswrapper[4831]: I1204 10:16:48.276085 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:48 crc kubenswrapper[4831]: E1204 10:16:48.276287 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:48 crc kubenswrapper[4831]: I1204 10:16:48.276419 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:48 crc kubenswrapper[4831]: E1204 10:16:48.276862 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:48 crc kubenswrapper[4831]: E1204 10:16:48.276990 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:49 crc kubenswrapper[4831]: I1204 10:16:49.276482 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:49 crc kubenswrapper[4831]: E1204 10:16:49.276748 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:50 crc kubenswrapper[4831]: I1204 10:16:50.275970 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:50 crc kubenswrapper[4831]: I1204 10:16:50.276051 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:50 crc kubenswrapper[4831]: E1204 10:16:50.276225 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:50 crc kubenswrapper[4831]: I1204 10:16:50.276298 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:50 crc kubenswrapper[4831]: E1204 10:16:50.276467 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:50 crc kubenswrapper[4831]: E1204 10:16:50.276580 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:51 crc kubenswrapper[4831]: I1204 10:16:51.275579 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:51 crc kubenswrapper[4831]: E1204 10:16:51.275782 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:52 crc kubenswrapper[4831]: I1204 10:16:52.276304 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:52 crc kubenswrapper[4831]: I1204 10:16:52.276346 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:52 crc kubenswrapper[4831]: I1204 10:16:52.276305 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:52 crc kubenswrapper[4831]: E1204 10:16:52.276430 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:52 crc kubenswrapper[4831]: E1204 10:16:52.276551 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:52 crc kubenswrapper[4831]: E1204 10:16:52.276843 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:53 crc kubenswrapper[4831]: I1204 10:16:53.276409 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:53 crc kubenswrapper[4831]: E1204 10:16:53.277818 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:53 crc kubenswrapper[4831]: I1204 10:16:53.278032 4831 scope.go:117] "RemoveContainer" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" Dec 04 10:16:53 crc kubenswrapper[4831]: E1204 10:16:53.278232 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xzkp_openshift-ovn-kubernetes(1261b9db-fe52-4fbc-9a9c-7e0c3486276e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" Dec 04 10:16:54 crc kubenswrapper[4831]: I1204 10:16:54.275723 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:54 crc kubenswrapper[4831]: I1204 10:16:54.275765 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:54 crc kubenswrapper[4831]: I1204 10:16:54.275890 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:54 crc kubenswrapper[4831]: E1204 10:16:54.276058 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:54 crc kubenswrapper[4831]: E1204 10:16:54.276179 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:54 crc kubenswrapper[4831]: E1204 10:16:54.276237 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:55 crc kubenswrapper[4831]: I1204 10:16:55.276299 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:55 crc kubenswrapper[4831]: E1204 10:16:55.277391 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:56 crc kubenswrapper[4831]: I1204 10:16:56.275912 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:56 crc kubenswrapper[4831]: I1204 10:16:56.275996 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:56 crc kubenswrapper[4831]: E1204 10:16:56.276090 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:56 crc kubenswrapper[4831]: E1204 10:16:56.276280 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:56 crc kubenswrapper[4831]: I1204 10:16:56.276393 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:56 crc kubenswrapper[4831]: E1204 10:16:56.276869 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:57 crc kubenswrapper[4831]: I1204 10:16:57.276554 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:57 crc kubenswrapper[4831]: E1204 10:16:57.276880 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:16:57 crc kubenswrapper[4831]: I1204 10:16:57.900523 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5g27v_c6a78509-d612-4338-8562-9b0627c1793f/kube-multus/1.log" Dec 04 10:16:57 crc kubenswrapper[4831]: I1204 10:16:57.901504 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5g27v_c6a78509-d612-4338-8562-9b0627c1793f/kube-multus/0.log" Dec 04 10:16:57 crc kubenswrapper[4831]: I1204 10:16:57.901596 4831 generic.go:334] "Generic (PLEG): container finished" podID="c6a78509-d612-4338-8562-9b0627c1793f" containerID="f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a" exitCode=1 Dec 04 10:16:57 crc kubenswrapper[4831]: I1204 10:16:57.901643 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5g27v" event={"ID":"c6a78509-d612-4338-8562-9b0627c1793f","Type":"ContainerDied","Data":"f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a"} Dec 04 10:16:57 crc kubenswrapper[4831]: I1204 10:16:57.901759 4831 scope.go:117] "RemoveContainer" containerID="8109a06b8e0bea96f12e0e1cf757a34d1da4d89159cac1247613d59cfa5645df" Dec 04 10:16:57 crc kubenswrapper[4831]: I1204 10:16:57.902640 4831 
scope.go:117] "RemoveContainer" containerID="f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a" Dec 04 10:16:57 crc kubenswrapper[4831]: E1204 10:16:57.903071 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-5g27v_openshift-multus(c6a78509-d612-4338-8562-9b0627c1793f)\"" pod="openshift-multus/multus-5g27v" podUID="c6a78509-d612-4338-8562-9b0627c1793f" Dec 04 10:16:58 crc kubenswrapper[4831]: I1204 10:16:58.275631 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:16:58 crc kubenswrapper[4831]: I1204 10:16:58.275695 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:16:58 crc kubenswrapper[4831]: I1204 10:16:58.275715 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:16:58 crc kubenswrapper[4831]: E1204 10:16:58.275769 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:16:58 crc kubenswrapper[4831]: E1204 10:16:58.275945 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:16:58 crc kubenswrapper[4831]: E1204 10:16:58.276024 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:16:58 crc kubenswrapper[4831]: I1204 10:16:58.907378 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5g27v_c6a78509-d612-4338-8562-9b0627c1793f/kube-multus/1.log" Dec 04 10:16:59 crc kubenswrapper[4831]: I1204 10:16:59.276020 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:16:59 crc kubenswrapper[4831]: E1204 10:16:59.276243 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:17:00 crc kubenswrapper[4831]: I1204 10:17:00.275912 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:00 crc kubenswrapper[4831]: I1204 10:17:00.276173 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:00 crc kubenswrapper[4831]: I1204 10:17:00.276175 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:17:00 crc kubenswrapper[4831]: E1204 10:17:00.276344 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:17:00 crc kubenswrapper[4831]: E1204 10:17:00.276441 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:17:00 crc kubenswrapper[4831]: E1204 10:17:00.276495 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:17:01 crc kubenswrapper[4831]: I1204 10:17:01.275924 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:17:01 crc kubenswrapper[4831]: E1204 10:17:01.276472 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:17:02 crc kubenswrapper[4831]: I1204 10:17:02.275836 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:02 crc kubenswrapper[4831]: I1204 10:17:02.275897 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:02 crc kubenswrapper[4831]: I1204 10:17:02.275921 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:17:02 crc kubenswrapper[4831]: E1204 10:17:02.275986 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:17:02 crc kubenswrapper[4831]: E1204 10:17:02.276088 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:17:02 crc kubenswrapper[4831]: E1204 10:17:02.276251 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:17:03 crc kubenswrapper[4831]: I1204 10:17:03.275994 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:17:03 crc kubenswrapper[4831]: E1204 10:17:03.277781 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:17:03 crc kubenswrapper[4831]: E1204 10:17:03.285820 4831 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 04 10:17:03 crc kubenswrapper[4831]: E1204 10:17:03.410283 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 10:17:04 crc kubenswrapper[4831]: I1204 10:17:04.276327 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:04 crc kubenswrapper[4831]: I1204 10:17:04.276341 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:17:04 crc kubenswrapper[4831]: E1204 10:17:04.276460 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:17:04 crc kubenswrapper[4831]: I1204 10:17:04.276535 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:04 crc kubenswrapper[4831]: E1204 10:17:04.276630 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:17:04 crc kubenswrapper[4831]: E1204 10:17:04.276903 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:17:05 crc kubenswrapper[4831]: I1204 10:17:05.279103 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:17:05 crc kubenswrapper[4831]: E1204 10:17:05.279311 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:17:06 crc kubenswrapper[4831]: I1204 10:17:06.276231 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:06 crc kubenswrapper[4831]: I1204 10:17:06.276477 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:17:06 crc kubenswrapper[4831]: E1204 10:17:06.276620 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:17:06 crc kubenswrapper[4831]: I1204 10:17:06.276500 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:06 crc kubenswrapper[4831]: E1204 10:17:06.277017 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:17:06 crc kubenswrapper[4831]: I1204 10:17:06.276919 4831 scope.go:117] "RemoveContainer" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" Dec 04 10:17:06 crc kubenswrapper[4831]: E1204 10:17:06.276793 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:17:06 crc kubenswrapper[4831]: I1204 10:17:06.934157 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/3.log" Dec 04 10:17:06 crc kubenswrapper[4831]: I1204 10:17:06.937089 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerStarted","Data":"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e"} Dec 04 10:17:06 crc kubenswrapper[4831]: I1204 10:17:06.937509 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:17:06 crc kubenswrapper[4831]: I1204 10:17:06.963162 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podStartSLOduration=103.963144024 podStartE2EDuration="1m43.963144024s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:06.962876806 +0000 UTC m=+123.912052130" watchObservedRunningTime="2025-12-04 10:17:06.963144024 +0000 UTC m=+123.912319338" Dec 04 10:17:07 crc kubenswrapper[4831]: I1204 10:17:07.060783 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fd6cw"] Dec 04 10:17:07 crc kubenswrapper[4831]: I1204 10:17:07.060887 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:17:07 crc kubenswrapper[4831]: E1204 10:17:07.060967 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:17:08 crc kubenswrapper[4831]: I1204 10:17:08.276291 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:08 crc kubenswrapper[4831]: I1204 10:17:08.276321 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:17:08 crc kubenswrapper[4831]: I1204 10:17:08.276329 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:08 crc kubenswrapper[4831]: E1204 10:17:08.276431 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:17:08 crc kubenswrapper[4831]: E1204 10:17:08.276536 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:17:08 crc kubenswrapper[4831]: E1204 10:17:08.276612 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:17:08 crc kubenswrapper[4831]: I1204 10:17:08.277012 4831 scope.go:117] "RemoveContainer" containerID="f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a" Dec 04 10:17:08 crc kubenswrapper[4831]: E1204 10:17:08.411085 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 10:17:08 crc kubenswrapper[4831]: I1204 10:17:08.946529 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5g27v_c6a78509-d612-4338-8562-9b0627c1793f/kube-multus/1.log" Dec 04 10:17:08 crc kubenswrapper[4831]: I1204 10:17:08.946596 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5g27v" event={"ID":"c6a78509-d612-4338-8562-9b0627c1793f","Type":"ContainerStarted","Data":"cf6417296513bd39972c5b248d9ea179e028735ddb1b875af0424524b1d2c67d"} Dec 04 10:17:09 crc kubenswrapper[4831]: I1204 10:17:09.275898 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:17:09 crc kubenswrapper[4831]: E1204 10:17:09.276900 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:17:10 crc kubenswrapper[4831]: I1204 10:17:10.275950 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:10 crc kubenswrapper[4831]: I1204 10:17:10.275969 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:17:10 crc kubenswrapper[4831]: I1204 10:17:10.276079 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:10 crc kubenswrapper[4831]: E1204 10:17:10.276247 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:17:10 crc kubenswrapper[4831]: E1204 10:17:10.276370 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:17:10 crc kubenswrapper[4831]: E1204 10:17:10.276562 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:17:11 crc kubenswrapper[4831]: I1204 10:17:11.276465 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:17:11 crc kubenswrapper[4831]: E1204 10:17:11.276611 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:17:12 crc kubenswrapper[4831]: I1204 10:17:12.275872 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:12 crc kubenswrapper[4831]: I1204 10:17:12.275926 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:12 crc kubenswrapper[4831]: I1204 10:17:12.275955 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:17:12 crc kubenswrapper[4831]: E1204 10:17:12.275993 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 10:17:12 crc kubenswrapper[4831]: E1204 10:17:12.276107 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 10:17:12 crc kubenswrapper[4831]: E1204 10:17:12.276307 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 10:17:13 crc kubenswrapper[4831]: I1204 10:17:13.276015 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:17:13 crc kubenswrapper[4831]: E1204 10:17:13.277300 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd6cw" podUID="5b50ce71-ca0a-4532-86e9-4f779dcc7b93" Dec 04 10:17:14 crc kubenswrapper[4831]: I1204 10:17:14.276176 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:14 crc kubenswrapper[4831]: I1204 10:17:14.276189 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:14 crc kubenswrapper[4831]: I1204 10:17:14.276358 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:17:14 crc kubenswrapper[4831]: I1204 10:17:14.280006 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 10:17:14 crc kubenswrapper[4831]: I1204 10:17:14.280003 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 10:17:14 crc kubenswrapper[4831]: I1204 10:17:14.280514 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 10:17:14 crc kubenswrapper[4831]: I1204 10:17:14.281881 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 10:17:15 crc kubenswrapper[4831]: I1204 10:17:15.276692 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:17:15 crc kubenswrapper[4831]: I1204 10:17:15.279532 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 10:17:15 crc kubenswrapper[4831]: I1204 10:17:15.279828 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.377391 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.436795 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.437359 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.438011 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qssm4"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.443149 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.443230 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.443518 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.443931 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.445325 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.453017 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.453482 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4bj64"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.453938 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.454214 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.454559 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.454978 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pgqgv"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.455274 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.456165 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-skjn8"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.456961 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.457070 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-27tcv"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.457780 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.461990 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.462980 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.463047 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.463211 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.463290 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.470956 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wjdnp"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.477457 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cmg79"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.477846 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-d8prt"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.478122 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.478486 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.478644 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.478668 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.478719 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.478932 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.479784 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2h8nq"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.480232 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.480439 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.480859 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.481538 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-2h8nq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.490840 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.490982 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.493416 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-twg79"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.493797 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.494177 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.494590 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.494762 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495012 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495023 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495122 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495233 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495398 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495448 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495521 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495558 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495620 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495646 4831 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495746 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495762 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495829 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495859 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495894 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495982 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.496012 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.502782 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.502934 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.503083 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 
10:17:18.503136 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.503266 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.503327 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.503451 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.503511 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.503623 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.503706 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.503447 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.503841 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.503952 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.504103 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 
10:17:18.504142 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.504316 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.504378 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.504590 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.504639 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.504759 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.504898 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.505215 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.505562 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.505688 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495994 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.505951 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.505995 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.506221 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.506328 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.495866 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.506729 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.507078 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.507324 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.507420 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.508167 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 
10:17:18.528054 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.528970 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.529545 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.552510 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.553014 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.554240 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.554360 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.554361 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.554502 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.554503 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.554693 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.554727 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.554696 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.555228 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.555473 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.555763 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.555957 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556019 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556100 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556353 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a3a8166-a7c6-4fb0-a2a1-83003acf436f-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-xjd68\" (UID: \"9a3a8166-a7c6-4fb0-a2a1-83003acf436f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556374 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8vzk\" (UniqueName: \"kubernetes.io/projected/7bed6763-7fd9-492f-8a67-cd65b0061950-kube-api-access-r8vzk\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556389 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-trusted-ca-bundle\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556407 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kttjg\" (UniqueName: \"kubernetes.io/projected/edd157b6-46a0-4a10-94fb-670544f743ca-kube-api-access-kttjg\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556422 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9zl7\" (UniqueName: \"kubernetes.io/projected/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-kube-api-access-l9zl7\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 
crc kubenswrapper[4831]: I1204 10:17:18.556439 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556456 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-service-ca-bundle\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556473 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-client-ca\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556488 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e57fed72-8221-44fb-8108-44e9a0dcc51a-etcd-client\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556502 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-oauth-serving-cert\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556518 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-service-ca\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556533 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bf8f0aa6-641c-4258-bd46-541bf71d40b1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cmg79\" (UID: \"bf8f0aa6-641c-4258-bd46-541bf71d40b1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556549 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmfz5\" (UniqueName: \"kubernetes.io/projected/45a39bbd-157a-404e-ae10-a4a2893de563-kube-api-access-qmfz5\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbnbq\" (UID: \"45a39bbd-157a-404e-ae10-a4a2893de563\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556563 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-image-import-ca\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" 
Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556578 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf8f0aa6-641c-4258-bd46-541bf71d40b1-serving-cert\") pod \"openshift-config-operator-7777fb866f-cmg79\" (UID: \"bf8f0aa6-641c-4258-bd46-541bf71d40b1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556594 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-client-ca\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556608 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-policies\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556622 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e57fed72-8221-44fb-8108-44e9a0dcc51a-audit-policies\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556636 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.556649 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-audit\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.559907 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7czl2"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.560388 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7b6hb"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.560422 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.560914 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.560984 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561275 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc5mt\" (UniqueName: \"kubernetes.io/projected/51167dc9-74db-4809-8d67-b9e4a48c039c-kube-api-access-bc5mt\") pod \"openshift-apiserver-operator-796bbdcf4f-m44sl\" (UID: \"51167dc9-74db-4809-8d67-b9e4a48c039c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561320 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e57fed72-8221-44fb-8108-44e9a0dcc51a-encryption-config\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561347 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561374 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bed6763-7fd9-492f-8a67-cd65b0061950-trusted-ca\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " 
pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561392 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e3ea5ca7-6c75-482a-9245-518056647743-audit-dir\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561445 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lfgb\" (UniqueName: \"kubernetes.io/projected/db6feab7-169d-4a82-a204-f9a1ff56f85e-kube-api-access-9lfgb\") pod \"dns-operator-744455d44c-wjdnp\" (UID: \"db6feab7-169d-4a82-a204-f9a1ff56f85e\") " pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561463 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-config\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561484 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561506 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561527 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561555 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-config\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561575 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3ea5ca7-6c75-482a-9245-518056647743-serving-cert\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561579 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561594 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e57fed72-8221-44fb-8108-44e9a0dcc51a-serving-cert\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561610 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7da1b053-67d9-4a4c-842e-cafb5dce5017-images\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561878 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bed6763-7fd9-492f-8a67-cd65b0061950-config\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561893 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-546qp\" (UniqueName: \"kubernetes.io/projected/e3ea5ca7-6c75-482a-9245-518056647743-kube-api-access-546qp\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561912 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edd157b6-46a0-4a10-94fb-670544f743ca-serving-cert\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 
10:17:18.561935 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561949 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-console-config\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.561987 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-config\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562001 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db6feab7-169d-4a82-a204-f9a1ff56f85e-metrics-tls\") pod \"dns-operator-744455d44c-wjdnp\" (UID: \"db6feab7-169d-4a82-a204-f9a1ff56f85e\") " pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562017 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsnh\" (UniqueName: \"kubernetes.io/projected/08171e38-fc77-4d5a-acc9-58dca0784830-kube-api-access-6xsnh\") pod \"downloads-7954f5f757-2h8nq\" 
(UID: \"08171e38-fc77-4d5a-acc9-58dca0784830\") " pod="openshift-console/downloads-7954f5f757-2h8nq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562048 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-auth-proxy-config\") pod \"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562064 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-machine-approver-tls\") pod \"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562095 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562111 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a39bbd-157a-404e-ae10-a4a2893de563-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbnbq\" (UID: \"45a39bbd-157a-404e-ae10-a4a2893de563\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562125 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-serving-cert\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562149 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mpcg\" (UniqueName: \"kubernetes.io/projected/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-kube-api-access-2mpcg\") pod \"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562164 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rb6h\" (UniqueName: \"kubernetes.io/projected/9a3a8166-a7c6-4fb0-a2a1-83003acf436f-kube-api-access-8rb6h\") pod \"cluster-samples-operator-665b6dd947-xjd68\" (UID: \"9a3a8166-a7c6-4fb0-a2a1-83003acf436f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562179 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvq7\" (UniqueName: 
\"kubernetes.io/projected/bf8f0aa6-641c-4258-bd46-541bf71d40b1-kube-api-access-5lvq7\") pod \"openshift-config-operator-7777fb866f-cmg79\" (UID: \"bf8f0aa6-641c-4258-bd46-541bf71d40b1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562194 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562208 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-oauth-config\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562221 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e3ea5ca7-6c75-482a-9245-518056647743-etcd-client\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-serving-cert\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562270 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e57fed72-8221-44fb-8108-44e9a0dcc51a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562284 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51167dc9-74db-4809-8d67-b9e4a48c039c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m44sl\" (UID: \"51167dc9-74db-4809-8d67-b9e4a48c039c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562301 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-config\") pod \"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562317 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7da1b053-67d9-4a4c-842e-cafb5dce5017-config\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562330 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-etcd-serving-ca\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562346 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e57fed72-8221-44fb-8108-44e9a0dcc51a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562360 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7da1b053-67d9-4a4c-842e-cafb5dce5017-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562376 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn99k\" (UniqueName: \"kubernetes.io/projected/7da1b053-67d9-4a4c-842e-cafb5dce5017-kube-api-access-dn99k\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562391 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562406 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51167dc9-74db-4809-8d67-b9e4a48c039c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m44sl\" (UID: \"51167dc9-74db-4809-8d67-b9e4a48c039c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562421 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562435 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-trusted-ca-bundle\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562449 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e57fed72-8221-44fb-8108-44e9a0dcc51a-audit-dir\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562464 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7bed6763-7fd9-492f-8a67-cd65b0061950-serving-cert\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562480 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztdmw\" (UniqueName: \"kubernetes.io/projected/e57fed72-8221-44fb-8108-44e9a0dcc51a-kube-api-access-ztdmw\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562494 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcdwq\" (UniqueName: \"kubernetes.io/projected/2b483323-5ed6-40b5-b256-c9de7033e4eb-kube-api-access-hcdwq\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562510 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-config\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562528 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-dir\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 
10:17:18.562542 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzs4p\" (UniqueName: \"kubernetes.io/projected/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-kube-api-access-gzs4p\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562556 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97bjx\" (UniqueName: \"kubernetes.io/projected/bc1e4c46-b909-410a-980d-a045d5b3a636-kube-api-access-97bjx\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562572 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e3ea5ca7-6c75-482a-9245-518056647743-node-pullsecrets\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562637 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562653 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b483323-5ed6-40b5-b256-c9de7033e4eb-serving-cert\") pod 
\"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562684 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45a39bbd-157a-404e-ae10-a4a2893de563-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbnbq\" (UID: \"45a39bbd-157a-404e-ae10-a4a2893de563\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.562699 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e3ea5ca7-6c75-482a-9245-518056647743-encryption-config\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.563004 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.563158 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.563631 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.563736 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.563935 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.563976 4831 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.564071 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.564085 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.564172 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.564175 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.564250 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.564303 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.566019 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.566479 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.566622 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.567234 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.568979 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.569010 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.569275 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.569372 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.569487 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.584199 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.584226 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.584559 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-nmqqm"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.589887 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.607076 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.607651 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.608683 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.609537 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.610137 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.611855 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.612251 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.613705 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.615907 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.615951 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.618452 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.618742 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.618836 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9sxtr"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.618995 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.619044 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.619765 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.620122 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.620640 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.620976 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2rbhc"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.621289 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.621950 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.622271 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.623133 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.623681 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.624705 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.625112 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.626567 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.627036 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qp6r2"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.627501 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.627593 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.628609 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.629091 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.629620 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.630278 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.630894 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.631597 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.632315 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.632473 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lzxr6"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.633258 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.633780 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9q8xt"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.634354 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9q8xt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.635674 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cmg79"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.636533 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4bj64"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.637785 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pgqgv"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.638939 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qssm4"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.642191 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-27tcv"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.642822 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.643838 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.644796 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.646162 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9sxtr"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.646572 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.647379 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2h8nq"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.648518 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.649766 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.651820 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.654140 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.656168 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-skjn8"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.658003 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wjdnp"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.659236 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.660225 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7czl2"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.661526 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 
10:17:18.662770 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-d8prt"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663367 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-service-ca\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663408 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-849pd\" (UniqueName: \"kubernetes.io/projected/fbe252df-9100-4e9e-a054-1463b3fa50ed-kube-api-access-849pd\") pod \"catalog-operator-68c6474976-l4jk9\" (UID: \"fbe252df-9100-4e9e-a054-1463b3fa50ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663439 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bf8f0aa6-641c-4258-bd46-541bf71d40b1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cmg79\" (UID: \"bf8f0aa6-641c-4258-bd46-541bf71d40b1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663464 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmfz5\" (UniqueName: \"kubernetes.io/projected/45a39bbd-157a-404e-ae10-a4a2893de563-kube-api-access-qmfz5\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbnbq\" (UID: \"45a39bbd-157a-404e-ae10-a4a2893de563\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663521 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-image-import-ca\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663607 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf8f0aa6-641c-4258-bd46-541bf71d40b1-serving-cert\") pod \"openshift-config-operator-7777fb866f-cmg79\" (UID: \"bf8f0aa6-641c-4258-bd46-541bf71d40b1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663652 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-client-ca\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663698 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-policies\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663721 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e57fed72-8221-44fb-8108-44e9a0dcc51a-audit-policies\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 
10:17:18.663742 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663760 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-audit\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663784 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc5mt\" (UniqueName: \"kubernetes.io/projected/51167dc9-74db-4809-8d67-b9e4a48c039c-kube-api-access-bc5mt\") pod \"openshift-apiserver-operator-796bbdcf4f-m44sl\" (UID: \"51167dc9-74db-4809-8d67-b9e4a48c039c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663810 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e57fed72-8221-44fb-8108-44e9a0dcc51a-encryption-config\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663825 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663841 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bed6763-7fd9-492f-8a67-cd65b0061950-trusted-ca\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663859 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fbe252df-9100-4e9e-a054-1463b3fa50ed-profile-collector-cert\") pod \"catalog-operator-68c6474976-l4jk9\" (UID: \"fbe252df-9100-4e9e-a054-1463b3fa50ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663874 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40be81d-febb-4200-8271-9b562f0ace35-config\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663895 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e3ea5ca7-6c75-482a-9245-518056647743-audit-dir\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663910 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lfgb\" (UniqueName: \"kubernetes.io/projected/db6feab7-169d-4a82-a204-f9a1ff56f85e-kube-api-access-9lfgb\") pod 
\"dns-operator-744455d44c-wjdnp\" (UID: \"db6feab7-169d-4a82-a204-f9a1ff56f85e\") " pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663925 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-config\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663940 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663961 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.663986 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664009 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-config\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664024 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3ea5ca7-6c75-482a-9245-518056647743-serving-cert\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664028 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bf8f0aa6-641c-4258-bd46-541bf71d40b1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cmg79\" (UID: \"bf8f0aa6-641c-4258-bd46-541bf71d40b1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664043 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e57fed72-8221-44fb-8108-44e9a0dcc51a-serving-cert\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664062 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7da1b053-67d9-4a4c-842e-cafb5dce5017-images\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: 
I1204 10:17:18.664082 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bed6763-7fd9-492f-8a67-cd65b0061950-config\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664102 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-546qp\" (UniqueName: \"kubernetes.io/projected/e3ea5ca7-6c75-482a-9245-518056647743-kube-api-access-546qp\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664123 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edd157b6-46a0-4a10-94fb-670544f743ca-serving-cert\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664143 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b40be81d-febb-4200-8271-9b562f0ace35-etcd-service-ca\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664169 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: 
\"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664190 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-console-config\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664223 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-config\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664246 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db6feab7-169d-4a82-a204-f9a1ff56f85e-metrics-tls\") pod \"dns-operator-744455d44c-wjdnp\" (UID: \"db6feab7-169d-4a82-a204-f9a1ff56f85e\") " pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664271 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsnh\" (UniqueName: \"kubernetes.io/projected/08171e38-fc77-4d5a-acc9-58dca0784830-kube-api-access-6xsnh\") pod \"downloads-7954f5f757-2h8nq\" (UID: \"08171e38-fc77-4d5a-acc9-58dca0784830\") " pod="openshift-console/downloads-7954f5f757-2h8nq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664297 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-auth-proxy-config\") pod 
\"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664321 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-machine-approver-tls\") pod \"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664341 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664358 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a39bbd-157a-404e-ae10-a4a2893de563-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbnbq\" (UID: \"45a39bbd-157a-404e-ae10-a4a2893de563\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664376 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-serving-cert\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664393 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664411 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mpcg\" (UniqueName: \"kubernetes.io/projected/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-kube-api-access-2mpcg\") pod \"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664411 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-service-ca\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664427 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rb6h\" (UniqueName: \"kubernetes.io/projected/9a3a8166-a7c6-4fb0-a2a1-83003acf436f-kube-api-access-8rb6h\") pod \"cluster-samples-operator-665b6dd947-xjd68\" (UID: \"9a3a8166-a7c6-4fb0-a2a1-83003acf436f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664463 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvq7\" (UniqueName: \"kubernetes.io/projected/bf8f0aa6-641c-4258-bd46-541bf71d40b1-kube-api-access-5lvq7\") pod \"openshift-config-operator-7777fb866f-cmg79\" (UID: \"bf8f0aa6-641c-4258-bd46-541bf71d40b1\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664485 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664504 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-oauth-config\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664519 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e3ea5ca7-6c75-482a-9245-518056647743-etcd-client\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664549 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fbe252df-9100-4e9e-a054-1463b3fa50ed-srv-cert\") pod \"catalog-operator-68c6474976-l4jk9\" (UID: \"fbe252df-9100-4e9e-a054-1463b3fa50ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664587 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-serving-cert\") pod 
\"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664603 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e57fed72-8221-44fb-8108-44e9a0dcc51a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664620 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51167dc9-74db-4809-8d67-b9e4a48c039c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m44sl\" (UID: \"51167dc9-74db-4809-8d67-b9e4a48c039c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664646 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-config\") pod \"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664696 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7da1b053-67d9-4a4c-842e-cafb5dce5017-config\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664718 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-etcd-serving-ca\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664740 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e57fed72-8221-44fb-8108-44e9a0dcc51a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664757 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7da1b053-67d9-4a4c-842e-cafb5dce5017-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664773 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkqsz\" (UniqueName: \"kubernetes.io/projected/b40be81d-febb-4200-8271-9b562f0ace35-kube-api-access-kkqsz\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664789 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn99k\" (UniqueName: \"kubernetes.io/projected/7da1b053-67d9-4a4c-842e-cafb5dce5017-kube-api-access-dn99k\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664807 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664804 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-policies\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664825 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51167dc9-74db-4809-8d67-b9e4a48c039c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m44sl\" (UID: \"51167dc9-74db-4809-8d67-b9e4a48c039c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664864 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-trusted-ca-bundle\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664894 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664922 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e57fed72-8221-44fb-8108-44e9a0dcc51a-audit-dir\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664948 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bed6763-7fd9-492f-8a67-cd65b0061950-serving-cert\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.664977 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztdmw\" (UniqueName: \"kubernetes.io/projected/e57fed72-8221-44fb-8108-44e9a0dcc51a-kube-api-access-ztdmw\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665001 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcdwq\" (UniqueName: \"kubernetes.io/projected/2b483323-5ed6-40b5-b256-c9de7033e4eb-kube-api-access-hcdwq\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665027 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-config\") pod \"apiserver-76f77b778f-skjn8\" (UID: 
\"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665052 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-dir\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665076 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzs4p\" (UniqueName: \"kubernetes.io/projected/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-kube-api-access-gzs4p\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665079 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bed6763-7fd9-492f-8a67-cd65b0061950-config\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665101 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97bjx\" (UniqueName: \"kubernetes.io/projected/bc1e4c46-b909-410a-980d-a045d5b3a636-kube-api-access-97bjx\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665129 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e3ea5ca7-6c75-482a-9245-518056647743-node-pullsecrets\") pod 
\"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665169 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665199 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45a39bbd-157a-404e-ae10-a4a2893de563-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbnbq\" (UID: \"45a39bbd-157a-404e-ae10-a4a2893de563\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665224 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e3ea5ca7-6c75-482a-9245-518056647743-encryption-config\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665251 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b40be81d-febb-4200-8271-9b562f0ace35-etcd-client\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665276 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b483323-5ed6-40b5-b256-c9de7033e4eb-serving-cert\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665302 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a3a8166-a7c6-4fb0-a2a1-83003acf436f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xjd68\" (UID: \"9a3a8166-a7c6-4fb0-a2a1-83003acf436f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665328 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8vzk\" (UniqueName: \"kubernetes.io/projected/7bed6763-7fd9-492f-8a67-cd65b0061950-kube-api-access-r8vzk\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665351 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-trusted-ca-bundle\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665376 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kttjg\" (UniqueName: \"kubernetes.io/projected/edd157b6-46a0-4a10-94fb-670544f743ca-kube-api-access-kttjg\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 
04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665400 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b40be81d-febb-4200-8271-9b562f0ace35-serving-cert\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665425 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9zl7\" (UniqueName: \"kubernetes.io/projected/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-kube-api-access-l9zl7\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665453 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665481 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-service-ca-bundle\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665506 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-client-ca\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665532 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e57fed72-8221-44fb-8108-44e9a0dcc51a-etcd-client\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665553 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-oauth-serving-cert\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665572 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b40be81d-febb-4200-8271-9b562f0ace35-etcd-ca\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.666266 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-config\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.666325 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.666337 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-config\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.667124 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-client-ca\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.665130 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e3ea5ca7-6c75-482a-9245-518056647743-audit-dir\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.669475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf8f0aa6-641c-4258-bd46-541bf71d40b1-serving-cert\") pod \"openshift-config-operator-7777fb866f-cmg79\" (UID: \"bf8f0aa6-641c-4258-bd46-541bf71d40b1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.669501 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51167dc9-74db-4809-8d67-b9e4a48c039c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m44sl\" (UID: \"51167dc9-74db-4809-8d67-b9e4a48c039c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.669825 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-oauth-config\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.670016 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.671145 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e57fed72-8221-44fb-8108-44e9a0dcc51a-audit-policies\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.671220 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edd157b6-46a0-4a10-94fb-670544f743ca-serving-cert\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.671705 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/e3ea5ca7-6c75-482a-9245-518056647743-node-pullsecrets\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.672545 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.672578 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bed6763-7fd9-492f-8a67-cd65b0061950-trusted-ca\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.672624 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-trusted-ca-bundle\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.673035 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.673081 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e57fed72-8221-44fb-8108-44e9a0dcc51a-audit-dir\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.673158 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-audit\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.674680 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.674958 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.675157 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-trusted-ca-bundle\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.675292 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-f9d7485db-twg79"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.675327 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7b6hb"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.675342 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.675693 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a39bbd-157a-404e-ae10-a4a2893de563-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbnbq\" (UID: \"45a39bbd-157a-404e-ae10-a4a2893de563\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.676366 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51167dc9-74db-4809-8d67-b9e4a48c039c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m44sl\" (UID: \"51167dc9-74db-4809-8d67-b9e4a48c039c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.676605 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-image-import-ca\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.676997 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-config\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " 
pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.676324 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e57fed72-8221-44fb-8108-44e9a0dcc51a-encryption-config\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.677164 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-dir\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.677899 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-serving-cert\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.678366 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-auth-proxy-config\") pod \"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.678379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: 
\"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.678785 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-machine-approver-tls\") pod \"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.678878 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e57fed72-8221-44fb-8108-44e9a0dcc51a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.678995 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bed6763-7fd9-492f-8a67-cd65b0061950-serving-cert\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.679266 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e3ea5ca7-6c75-482a-9245-518056647743-encryption-config\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.679502 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-error\") 
pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.679605 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-56tfw"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.679742 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45a39bbd-157a-404e-ae10-a4a2893de563-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbnbq\" (UID: \"45a39bbd-157a-404e-ae10-a4a2893de563\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.679913 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e3ea5ca7-6c75-482a-9245-518056647743-etcd-serving-ca\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.680064 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.680178 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.680284 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-service-ca-bundle\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.680345 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-config\") pod \"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.680481 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a3a8166-a7c6-4fb0-a2a1-83003acf436f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xjd68\" (UID: \"9a3a8166-a7c6-4fb0-a2a1-83003acf436f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.680600 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-config\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.680945 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b483323-5ed6-40b5-b256-c9de7033e4eb-serving-cert\") 
pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.681142 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-client-ca\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.681374 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7da1b053-67d9-4a4c-842e-cafb5dce5017-config\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.681408 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-console-config\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.679622 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-oauth-serving-cert\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.681685 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e57fed72-8221-44fb-8108-44e9a0dcc51a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.682536 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7da1b053-67d9-4a4c-842e-cafb5dce5017-images\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.683725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e57fed72-8221-44fb-8108-44e9a0dcc51a-etcd-client\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.683941 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.685147 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-serving-cert\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.685354 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: 
\"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.685577 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3ea5ca7-6c75-482a-9245-518056647743-serving-cert\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.685606 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.685725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.687172 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-d85d4"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.687365 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.687955 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7da1b053-67d9-4a4c-842e-cafb5dce5017-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.688288 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.688420 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db6feab7-169d-4a82-a204-f9a1ff56f85e-metrics-tls\") pod \"dns-operator-744455d44c-wjdnp\" (UID: \"db6feab7-169d-4a82-a204-f9a1ff56f85e\") " pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.688768 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.688863 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.689018 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d85d4" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.690576 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.691510 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e3ea5ca7-6c75-482a-9245-518056647743-etcd-client\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.692345 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.692499 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e57fed72-8221-44fb-8108-44e9a0dcc51a-serving-cert\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.693477 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2rbhc"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.694560 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.695601 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qp6r2"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.696766 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-56tfw"] Dec 04 10:17:18 crc 
kubenswrapper[4831]: I1204 10:17:18.697989 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lzxr6"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.699009 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.700083 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.701141 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.701166 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.702188 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.703190 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.704342 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d85d4"] Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.719085 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.744380 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.759314 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.766154 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b40be81d-febb-4200-8271-9b562f0ace35-etcd-ca\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.766278 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-849pd\" (UniqueName: \"kubernetes.io/projected/fbe252df-9100-4e9e-a054-1463b3fa50ed-kube-api-access-849pd\") pod \"catalog-operator-68c6474976-l4jk9\" (UID: \"fbe252df-9100-4e9e-a054-1463b3fa50ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.766431 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40be81d-febb-4200-8271-9b562f0ace35-config\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.766509 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fbe252df-9100-4e9e-a054-1463b3fa50ed-profile-collector-cert\") pod \"catalog-operator-68c6474976-l4jk9\" (UID: \"fbe252df-9100-4e9e-a054-1463b3fa50ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.766609 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b40be81d-febb-4200-8271-9b562f0ace35-etcd-service-ca\") pod 
\"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.766762 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fbe252df-9100-4e9e-a054-1463b3fa50ed-srv-cert\") pod \"catalog-operator-68c6474976-l4jk9\" (UID: \"fbe252df-9100-4e9e-a054-1463b3fa50ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.766894 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkqsz\" (UniqueName: \"kubernetes.io/projected/b40be81d-febb-4200-8271-9b562f0ace35-kube-api-access-kkqsz\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.767097 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b40be81d-febb-4200-8271-9b562f0ace35-etcd-client\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.767208 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b40be81d-febb-4200-8271-9b562f0ace35-serving-cert\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.779231 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 
10:17:18.799493 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.810788 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b40be81d-febb-4200-8271-9b562f0ace35-serving-cert\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.819810 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.829725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b40be81d-febb-4200-8271-9b562f0ace35-etcd-client\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.839480 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.859324 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.868352 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40be81d-febb-4200-8271-9b562f0ace35-config\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.879824 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.887033 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b40be81d-febb-4200-8271-9b562f0ace35-etcd-ca\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.899129 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.907617 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b40be81d-febb-4200-8271-9b562f0ace35-etcd-service-ca\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.919465 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.939581 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.960382 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 04 10:17:18 crc kubenswrapper[4831]: I1204 10:17:18.999822 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.019842 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 04 10:17:19 crc 
kubenswrapper[4831]: I1204 10:17:19.039899 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.049905 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fbe252df-9100-4e9e-a054-1463b3fa50ed-srv-cert\") pod \"catalog-operator-68c6474976-l4jk9\" (UID: \"fbe252df-9100-4e9e-a054-1463b3fa50ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.059043 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.079337 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.099856 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.111349 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fbe252df-9100-4e9e-a054-1463b3fa50ed-profile-collector-cert\") pod \"catalog-operator-68c6474976-l4jk9\" (UID: \"fbe252df-9100-4e9e-a054-1463b3fa50ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.119561 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.140061 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.160646 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.179501 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.199496 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.220746 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.241691 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.259759 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.279733 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.320239 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.339615 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.360839 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.380277 4831 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.400258 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.419390 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.439454 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.460887 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.480422 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.501089 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.519215 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.539166 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.560035 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.579870 4831 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.601043 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.621478 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.638065 4831 request.go:700] Waited for 1.017204097s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmco-proxy-tls&limit=500&resourceVersion=0 Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.640309 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.660015 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.681898 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.699991 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.720137 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.740743 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.759979 4831 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.780950 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.800435 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.820844 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.841369 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.859502 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.880002 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.899538 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.929519 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.941281 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.960229 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 10:17:19 crc kubenswrapper[4831]: I1204 10:17:19.981468 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.000194 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.020173 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.040459 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.060292 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.080196 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.099857 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.119556 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.139732 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 10:17:20 crc kubenswrapper[4831]: 
I1204 10:17:20.159932 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.179833 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.199225 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.220268 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.240909 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.260652 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.281030 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.300904 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.320515 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.341033 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.375239 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmfz5\" (UniqueName: 
\"kubernetes.io/projected/45a39bbd-157a-404e-ae10-a4a2893de563-kube-api-access-qmfz5\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbnbq\" (UID: \"45a39bbd-157a-404e-ae10-a4a2893de563\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.395972 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rb6h\" (UniqueName: \"kubernetes.io/projected/9a3a8166-a7c6-4fb0-a2a1-83003acf436f-kube-api-access-8rb6h\") pod \"cluster-samples-operator-665b6dd947-xjd68\" (UID: \"9a3a8166-a7c6-4fb0-a2a1-83003acf436f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.413387 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-546qp\" (UniqueName: \"kubernetes.io/projected/e3ea5ca7-6c75-482a-9245-518056647743-kube-api-access-546qp\") pod \"apiserver-76f77b778f-skjn8\" (UID: \"e3ea5ca7-6c75-482a-9245-518056647743\") " pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.435567 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lfgb\" (UniqueName: \"kubernetes.io/projected/db6feab7-169d-4a82-a204-f9a1ff56f85e-kube-api-access-9lfgb\") pod \"dns-operator-744455d44c-wjdnp\" (UID: \"db6feab7-169d-4a82-a204-f9a1ff56f85e\") " pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.439190 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.456942 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvq7\" (UniqueName: \"kubernetes.io/projected/bf8f0aa6-641c-4258-bd46-541bf71d40b1-kube-api-access-5lvq7\") pod \"openshift-config-operator-7777fb866f-cmg79\" (UID: \"bf8f0aa6-641c-4258-bd46-541bf71d40b1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.462935 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.479767 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztdmw\" (UniqueName: \"kubernetes.io/projected/e57fed72-8221-44fb-8108-44e9a0dcc51a-kube-api-access-ztdmw\") pod \"apiserver-7bbb656c7d-nqztc\" (UID: \"e57fed72-8221-44fb-8108-44e9a0dcc51a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.494103 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97bjx\" (UniqueName: \"kubernetes.io/projected/bc1e4c46-b909-410a-980d-a045d5b3a636-kube-api-access-97bjx\") pod \"console-f9d7485db-twg79\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.517644 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc5mt\" (UniqueName: \"kubernetes.io/projected/51167dc9-74db-4809-8d67-b9e4a48c039c-kube-api-access-bc5mt\") pod \"openshift-apiserver-operator-796bbdcf4f-m44sl\" (UID: \"51167dc9-74db-4809-8d67-b9e4a48c039c\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.522926 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.530437 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.537956 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcdwq\" (UniqueName: \"kubernetes.io/projected/2b483323-5ed6-40b5-b256-c9de7033e4eb-kube-api-access-hcdwq\") pod \"controller-manager-879f6c89f-qssm4\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.554506 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kttjg\" (UniqueName: \"kubernetes.io/projected/edd157b6-46a0-4a10-94fb-670544f743ca-kube-api-access-kttjg\") pod \"route-controller-manager-6576b87f9c-rsttb\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.574767 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.582056 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9zl7\" (UniqueName: \"kubernetes.io/projected/0d3db4fd-37e8-4257-8bb7-6ab5183de9e7-kube-api-access-l9zl7\") pod \"authentication-operator-69f744f599-27tcv\" (UID: \"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.594835 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.606712 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzs4p\" (UniqueName: \"kubernetes.io/projected/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-kube-api-access-gzs4p\") pod \"oauth-openshift-558db77b4-d8prt\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.616554 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsnh\" (UniqueName: \"kubernetes.io/projected/08171e38-fc77-4d5a-acc9-58dca0784830-kube-api-access-6xsnh\") pod \"downloads-7954f5f757-2h8nq\" (UID: \"08171e38-fc77-4d5a-acc9-58dca0784830\") " pod="openshift-console/downloads-7954f5f757-2h8nq" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.637913 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mpcg\" (UniqueName: \"kubernetes.io/projected/a592d34f-e6b6-4ec2-8c63-2249f70cd3f4-kube-api-access-2mpcg\") pod \"machine-approver-56656f9798-l2cns\" (UID: \"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.652719 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8vzk\" (UniqueName: \"kubernetes.io/projected/7bed6763-7fd9-492f-8a67-cd65b0061950-kube-api-access-r8vzk\") pod \"console-operator-58897d9998-pgqgv\" (UID: \"7bed6763-7fd9-492f-8a67-cd65b0061950\") " pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.653145 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.658581 4831 request.go:700] Waited for 1.978094826s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.664538 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.679791 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn99k\" (UniqueName: \"kubernetes.io/projected/7da1b053-67d9-4a4c-842e-cafb5dce5017-kube-api-access-dn99k\") pod \"machine-api-operator-5694c8668f-4bj64\" (UID: \"7da1b053-67d9-4a4c-842e-cafb5dce5017\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.679973 4831 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.690145 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.691727 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq"] Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.699940 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.718036 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.736071 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.736596 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.739688 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.743952 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wjdnp"] Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.745455 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.748706 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.757836 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.760629 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.779046 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.779196 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2h8nq" Dec 04 10:17:20 crc kubenswrapper[4831]: W1204 10:17:20.782710 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb6feab7_169d_4a82_a204_f9a1ff56f85e.slice/crio-42f5e91cdf13cad2e76e564793a057158f1d0ac899ec5ee7bd13774938a55d8f WatchSource:0}: Error finding container 42f5e91cdf13cad2e76e564793a057158f1d0ac899ec5ee7bd13774938a55d8f: Status 404 returned error can't find the container with id 42f5e91cdf13cad2e76e564793a057158f1d0ac899ec5ee7bd13774938a55d8f Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.799901 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.840354 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-849pd\" (UniqueName: \"kubernetes.io/projected/fbe252df-9100-4e9e-a054-1463b3fa50ed-kube-api-access-849pd\") pod \"catalog-operator-68c6474976-l4jk9\" (UID: \"fbe252df-9100-4e9e-a054-1463b3fa50ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.857956 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkqsz\" (UniqueName: 
\"kubernetes.io/projected/b40be81d-febb-4200-8271-9b562f0ace35-kube-api-access-kkqsz\") pod \"etcd-operator-b45778765-7czl2\" (UID: \"b40be81d-febb-4200-8271-9b562f0ace35\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.869481 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.894952 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25ecb4dd-adbb-48db-8563-78c6f9faff4f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.894999 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpck\" (UniqueName: \"kubernetes.io/projected/3acf1264-0fb4-46e4-a876-8e7677b39304-kube-api-access-twpck\") pod \"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895041 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3acf1264-0fb4-46e4-a876-8e7677b39304-metrics-tls\") pod \"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895126 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-bound-sa-token\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895151 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4783614b-88f9-4207-b6ef-f73824ce9334-metrics-certs\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895177 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3acf1264-0fb4-46e4-a876-8e7677b39304-trusted-ca\") pod \"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895226 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvshm\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-kube-api-access-wvshm\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895246 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3acf1264-0fb4-46e4-a876-8e7677b39304-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:20 crc kubenswrapper[4831]: 
I1204 10:17:20.895276 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4783614b-88f9-4207-b6ef-f73824ce9334-default-certificate\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895302 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ca0af9c8-f2fa-45f8-a428-9b061441bddf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895330 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4783614b-88f9-4207-b6ef-f73824ce9334-service-ca-bundle\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895356 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ca0af9c8-f2fa-45f8-a428-9b061441bddf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895373 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25ecb4dd-adbb-48db-8563-78c6f9faff4f-image-registry-operator-tls\") 
pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895396 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895424 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-trusted-ca\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895443 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g85sg\" (UniqueName: \"kubernetes.io/projected/4783614b-88f9-4207-b6ef-f73824ce9334-kube-api-access-g85sg\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895465 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d8pw\" (UniqueName: \"kubernetes.io/projected/effa094e-e36b-4039-8523-dc11cffa2894-kube-api-access-7d8pw\") pod \"migrator-59844c95c7-tf7kq\" (UID: \"effa094e-e36b-4039-8523-dc11cffa2894\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 
10:17:20.895486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7slwb\" (UniqueName: \"kubernetes.io/projected/25ecb4dd-adbb-48db-8563-78c6f9faff4f-kube-api-access-7slwb\") pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895521 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-certificates\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895543 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25ecb4dd-adbb-48db-8563-78c6f9faff4f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895562 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4783614b-88f9-4207-b6ef-f73824ce9334-stats-auth\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.895614 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-tls\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:20 crc kubenswrapper[4831]: E1204 10:17:20.896422 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:21.396410066 +0000 UTC m=+138.345585380 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:20 crc kubenswrapper[4831]: I1204 10:17:20.917803 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997024 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997472 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eb2c02c-884b-45f4-8387-e2e71df329a9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-twgk8\" (UID: \"7eb2c02c-884b-45f4-8387-e2e71df329a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:20.997524 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:21.497490755 +0000 UTC m=+138.446666099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997601 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-bound-sa-token\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997636 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qp6r2\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997690 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3acf1264-0fb4-46e4-a876-8e7677b39304-trusted-ca\") pod \"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997713 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/87734f51-2b97-4726-8eba-c95450c2686b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-frgsp\" (UID: \"87734f51-2b97-4726-8eba-c95450c2686b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997733 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbjdt\" (UniqueName: \"kubernetes.io/projected/73f9aaec-7f63-4909-9ffe-6b073e0225d9-kube-api-access-hbjdt\") pod \"marketplace-operator-79b997595-qp6r2\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997770 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qp6r2\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997785 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4ab7a865-d267-4ad7-88bd-e3b9e1f2510c-signing-cabundle\") pod \"service-ca-9c57cc56f-2rbhc\" (UID: \"4ab7a865-d267-4ad7-88bd-e3b9e1f2510c\") " pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997843 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwpvb\" (UniqueName: \"kubernetes.io/projected/04bb6d4d-643d-4015-9579-c522cfc43c2a-kube-api-access-cwpvb\") pod \"dns-default-lzxr6\" (UID: \"04bb6d4d-643d-4015-9579-c522cfc43c2a\") " pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:21 crc 
kubenswrapper[4831]: I1204 10:17:20.997889 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ca0af9c8-f2fa-45f8-a428-9b061441bddf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997910 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-csi-data-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997928 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-registration-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997945 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4783614b-88f9-4207-b6ef-f73824ce9334-service-ca-bundle\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.997985 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmf8\" (UniqueName: \"kubernetes.io/projected/7eb2c02c-884b-45f4-8387-e2e71df329a9-kube-api-access-xbmf8\") pod \"kube-storage-version-migrator-operator-b67b599dd-twgk8\" (UID: 
\"7eb2c02c-884b-45f4-8387-e2e71df329a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998010 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ca0af9c8-f2fa-45f8-a428-9b061441bddf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998030 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/51375b90-5931-416b-8a05-5a76d3e2852f-certs\") pod \"machine-config-server-9q8xt\" (UID: \"51375b90-5931-416b-8a05-5a76d3e2852f\") " pod="openshift-machine-config-operator/machine-config-server-9q8xt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998053 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w54wf\" (UniqueName: \"kubernetes.io/projected/51375b90-5931-416b-8a05-5a76d3e2852f-kube-api-access-w54wf\") pod \"machine-config-server-9q8xt\" (UID: \"51375b90-5931-416b-8a05-5a76d3e2852f\") " pod="openshift-machine-config-operator/machine-config-server-9q8xt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998074 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab874db5-da93-478d-927b-21104115cf12-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zln28\" (UID: \"ab874db5-da93-478d-927b-21104115cf12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998092 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/51375b90-5931-416b-8a05-5a76d3e2852f-node-bootstrap-token\") pod \"machine-config-server-9q8xt\" (UID: \"51375b90-5931-416b-8a05-5a76d3e2852f\") " pod="openshift-machine-config-operator/machine-config-server-9q8xt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998132 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d686eb67-074c-4086-b33c-c632acc21a66-webhook-cert\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998164 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d8pw\" (UniqueName: \"kubernetes.io/projected/effa094e-e36b-4039-8523-dc11cffa2894-kube-api-access-7d8pw\") pod \"migrator-59844c95c7-tf7kq\" (UID: \"effa094e-e36b-4039-8523-dc11cffa2894\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998195 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-certificates\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998212 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61498aaa-6ad9-4572-954f-40d841edd6d8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wtt7n\" (UID: \"61498aaa-6ad9-4572-954f-40d841edd6d8\") 
" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998236 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4783614b-88f9-4207-b6ef-f73824ce9334-stats-auth\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998258 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7eb2c02c-884b-45f4-8387-e2e71df329a9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-twgk8\" (UID: \"7eb2c02c-884b-45f4-8387-e2e71df329a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998296 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61498aaa-6ad9-4572-954f-40d841edd6d8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wtt7n\" (UID: \"61498aaa-6ad9-4572-954f-40d841edd6d8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998324 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-tls\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998351 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25ecb4dd-adbb-48db-8563-78c6f9faff4f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998372 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61498aaa-6ad9-4572-954f-40d841edd6d8-config\") pod \"kube-controller-manager-operator-78b949d7b-wtt7n\" (UID: \"61498aaa-6ad9-4572-954f-40d841edd6d8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998391 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d686eb67-074c-4086-b33c-c632acc21a66-apiservice-cert\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998436 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twpck\" (UniqueName: \"kubernetes.io/projected/3acf1264-0fb4-46e4-a876-8e7677b39304-kube-api-access-twpck\") pod \"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998461 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93463415-c819-44b0-9ee7-0de0698eb6a6-config-volume\") pod \"collect-profiles-29414055-8r2vj\" (UID: 
\"93463415-c819-44b0-9ee7-0de0698eb6a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998481 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5d23b38-2bda-4faf-a3da-8565336a48a2-cert\") pod \"ingress-canary-d85d4\" (UID: \"f5d23b38-2bda-4faf-a3da-8565336a48a2\") " pod="openshift-ingress-canary/ingress-canary-d85d4" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998503 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6vmb\" (UniqueName: \"kubernetes.io/projected/a8c8337c-874e-4c1d-9656-aeefef264a92-kube-api-access-b6vmb\") pod \"multus-admission-controller-857f4d67dd-9sxtr\" (UID: \"a8c8337c-874e-4c1d-9656-aeefef264a92\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998600 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad2c81a8-b9ac-4332-a6ae-a27e95e90537-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2pnnb\" (UID: \"ad2c81a8-b9ac-4332-a6ae-a27e95e90537\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998620 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/87734f51-2b97-4726-8eba-c95450c2686b-srv-cert\") pod \"olm-operator-6b444d44fb-frgsp\" (UID: \"87734f51-2b97-4726-8eba-c95450c2686b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998650 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04bb6d4d-643d-4015-9579-c522cfc43c2a-metrics-tls\") pod \"dns-default-lzxr6\" (UID: \"04bb6d4d-643d-4015-9579-c522cfc43c2a\") " pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998689 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-socket-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998783 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab874db5-da93-478d-927b-21104115cf12-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zln28\" (UID: \"ab874db5-da93-478d-927b-21104115cf12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998842 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj75s\" (UniqueName: \"kubernetes.io/projected/68cdd202-055b-41c7-ac5f-a13b918c44fc-kube-api-access-zj75s\") pod \"control-plane-machine-set-operator-78cbb6b69f-mgrdx\" (UID: \"68cdd202-055b-41c7-ac5f-a13b918c44fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998902 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4783614b-88f9-4207-b6ef-f73824ce9334-metrics-certs\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:21 
crc kubenswrapper[4831]: I1204 10:17:20.998933 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c589b350-6e72-40be-9f47-b26e80dc5ba6-proxy-tls\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: \"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.998973 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tswx8\" (UniqueName: \"kubernetes.io/projected/87734f51-2b97-4726-8eba-c95450c2686b-kube-api-access-tswx8\") pod \"olm-operator-6b444d44fb-frgsp\" (UID: \"87734f51-2b97-4726-8eba-c95450c2686b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999008 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvshm\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-kube-api-access-wvshm\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999035 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3acf1264-0fb4-46e4-a876-8e7677b39304-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999062 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/20c7a900-7157-4468-bd68-e74c2d382e85-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bp6kt\" (UID: \"20c7a900-7157-4468-bd68-e74c2d382e85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999085 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93463415-c819-44b0-9ee7-0de0698eb6a6-secret-volume\") pod \"collect-profiles-29414055-8r2vj\" (UID: \"93463415-c819-44b0-9ee7-0de0698eb6a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999115 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4783614b-88f9-4207-b6ef-f73824ce9334-default-certificate\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999162 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgzwd\" (UniqueName: \"kubernetes.io/projected/f5d23b38-2bda-4faf-a3da-8565336a48a2-kube-api-access-fgzwd\") pod \"ingress-canary-d85d4\" (UID: \"f5d23b38-2bda-4faf-a3da-8565336a48a2\") " pod="openshift-ingress-canary/ingress-canary-d85d4" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999188 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j9sd\" (UniqueName: \"kubernetes.io/projected/d686eb67-074c-4086-b33c-c632acc21a66-kube-api-access-9j9sd\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 
10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999211 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a8c8337c-874e-4c1d-9656-aeefef264a92-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9sxtr\" (UID: \"a8c8337c-874e-4c1d-9656-aeefef264a92\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999577 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25ecb4dd-adbb-48db-8563-78c6f9faff4f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999637 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999692 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2c81a8-b9ac-4332-a6ae-a27e95e90537-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2pnnb\" (UID: \"ad2c81a8-b9ac-4332-a6ae-a27e95e90537\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999713 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkxn5\" 
(UniqueName: \"kubernetes.io/projected/20c7a900-7157-4468-bd68-e74c2d382e85-kube-api-access-vkxn5\") pod \"machine-config-controller-84d6567774-bp6kt\" (UID: \"20c7a900-7157-4468-bd68-e74c2d382e85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999734 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94zfl\" (UniqueName: \"kubernetes.io/projected/db1720b8-e97a-4399-8c33-a0dfe81a4621-kube-api-access-94zfl\") pod \"package-server-manager-789f6589d5-zng9d\" (UID: \"db1720b8-e97a-4399-8c33-a0dfe81a4621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999757 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5psh\" (UniqueName: \"kubernetes.io/projected/c589b350-6e72-40be-9f47-b26e80dc5ba6-kube-api-access-x5psh\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: \"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999775 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab874db5-da93-478d-927b-21104115cf12-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zln28\" (UID: \"ab874db5-da93-478d-927b-21104115cf12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999809 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g85sg\" (UniqueName: \"kubernetes.io/projected/4783614b-88f9-4207-b6ef-f73824ce9334-kube-api-access-g85sg\") pod 
\"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999850 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-trusted-ca\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999868 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/db1720b8-e97a-4399-8c33-a0dfe81a4621-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zng9d\" (UID: \"db1720b8-e97a-4399-8c33-a0dfe81a4621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999888 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77vwm\" (UniqueName: \"kubernetes.io/projected/4ab7a865-d267-4ad7-88bd-e3b9e1f2510c-kube-api-access-77vwm\") pod \"service-ca-9c57cc56f-2rbhc\" (UID: \"4ab7a865-d267-4ad7-88bd-e3b9e1f2510c\") " pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999907 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7slwb\" (UniqueName: \"kubernetes.io/projected/25ecb4dd-adbb-48db-8563-78c6f9faff4f-kube-api-access-7slwb\") pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:20.999954 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-mountpoint-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.000132 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3acf1264-0fb4-46e4-a876-8e7677b39304-trusted-ca\") pod \"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.001565 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4783614b-88f9-4207-b6ef-f73824ce9334-service-ca-bundle\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.001815 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ca0af9c8-f2fa-45f8-a428-9b061441bddf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.003785 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-certificates\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc 
kubenswrapper[4831]: E1204 10:17:21.005554 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:21.505537998 +0000 UTC m=+138.454713312 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.005820 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25ecb4dd-adbb-48db-8563-78c6f9faff4f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.005865 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsh97\" (UniqueName: \"kubernetes.io/projected/a57c45a9-d720-4144-86c6-dc4c3cfbd9bd-kube-api-access-dsh97\") pod \"service-ca-operator-777779d784-mlcpb\" (UID: \"a57c45a9-d720-4144-86c6-dc4c3cfbd9bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.005975 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.006076 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57c45a9-d720-4144-86c6-dc4c3cfbd9bd-config\") pod \"service-ca-operator-777779d784-mlcpb\" (UID: \"a57c45a9-d720-4144-86c6-dc4c3cfbd9bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.006156 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c589b350-6e72-40be-9f47-b26e80dc5ba6-images\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: \"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.006219 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57c45a9-d720-4144-86c6-dc4c3cfbd9bd-serving-cert\") pod \"service-ca-operator-777779d784-mlcpb\" (UID: \"a57c45a9-d720-4144-86c6-dc4c3cfbd9bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.006253 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-plugins-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.006272 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04bb6d4d-643d-4015-9579-c522cfc43c2a-config-volume\") pod \"dns-default-lzxr6\" (UID: \"04bb6d4d-643d-4015-9579-c522cfc43c2a\") " 
pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.006566 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4783614b-88f9-4207-b6ef-f73824ce9334-default-certificate\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.006235 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ca0af9c8-f2fa-45f8-a428-9b061441bddf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.007127 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25ecb4dd-adbb-48db-8563-78c6f9faff4f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.007155 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4783614b-88f9-4207-b6ef-f73824ce9334-metrics-certs\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.008821 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-trusted-ca\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: 
\"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.010528 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68cdd202-055b-41c7-ac5f-a13b918c44fc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mgrdx\" (UID: \"68cdd202-055b-41c7-ac5f-a13b918c44fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.010577 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmrtg\" (UniqueName: \"kubernetes.io/projected/93463415-c819-44b0-9ee7-0de0698eb6a6-kube-api-access-lmrtg\") pod \"collect-profiles-29414055-8r2vj\" (UID: \"93463415-c819-44b0-9ee7-0de0698eb6a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.010623 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20c7a900-7157-4468-bd68-e74c2d382e85-proxy-tls\") pod \"machine-config-controller-84d6567774-bp6kt\" (UID: \"20c7a900-7157-4468-bd68-e74c2d382e85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.010648 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2c81a8-b9ac-4332-a6ae-a27e95e90537-config\") pod \"kube-apiserver-operator-766d6c64bb-2pnnb\" (UID: \"ad2c81a8-b9ac-4332-a6ae-a27e95e90537\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" Dec 04 10:17:21 crc 
kubenswrapper[4831]: I1204 10:17:21.010703 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sptxp\" (UniqueName: \"kubernetes.io/projected/37cadb5f-eedf-40ab-b4db-44d9935e26eb-kube-api-access-sptxp\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.010747 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c589b350-6e72-40be-9f47-b26e80dc5ba6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: \"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.010822 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4ab7a865-d267-4ad7-88bd-e3b9e1f2510c-signing-key\") pod \"service-ca-9c57cc56f-2rbhc\" (UID: \"4ab7a865-d267-4ad7-88bd-e3b9e1f2510c\") " pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.010826 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25ecb4dd-adbb-48db-8563-78c6f9faff4f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.011089 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3acf1264-0fb4-46e4-a876-8e7677b39304-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.011259 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d686eb67-074c-4086-b33c-c632acc21a66-tmpfs\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.021105 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-twg79"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.021322 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3acf1264-0fb4-46e4-a876-8e7677b39304-metrics-tls\") pod \"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.023385 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4783614b-88f9-4207-b6ef-f73824ce9334-stats-auth\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.028438 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-tls\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.039220 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" event={"ID":"45a39bbd-157a-404e-ae10-a4a2893de563","Type":"ContainerStarted","Data":"c1dfa7c9afe9c5bb732f37d6bf957782cd62cd06995c50d6c77d4ad9937fd789"} Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.040434 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-bound-sa-token\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.042424 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" event={"ID":"db6feab7-169d-4a82-a204-f9a1ff56f85e","Type":"ContainerStarted","Data":"42f5e91cdf13cad2e76e564793a057158f1d0ac899ec5ee7bd13774938a55d8f"} Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.043200 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" event={"ID":"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4","Type":"ContainerStarted","Data":"dd3d5d3b871f30196a2045ff334285747cbfa890163e2807f683f7f907e00d2d"} Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.059545 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.069367 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25ecb4dd-adbb-48db-8563-78c6f9faff4f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.076036 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpck\" (UniqueName: \"kubernetes.io/projected/3acf1264-0fb4-46e4-a876-8e7677b39304-kube-api-access-twpck\") pod \"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.088313 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qssm4"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.095388 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d8pw\" (UniqueName: \"kubernetes.io/projected/effa094e-e36b-4039-8523-dc11cffa2894-kube-api-access-7d8pw\") pod \"migrator-59844c95c7-tf7kq\" (UID: \"effa094e-e36b-4039-8523-dc11cffa2894\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.112678 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.112867 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:21.612847747 +0000 UTC m=+138.562023061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.112920 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c589b350-6e72-40be-9f47-b26e80dc5ba6-proxy-tls\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: \"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.112980 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tswx8\" (UniqueName: \"kubernetes.io/projected/87734f51-2b97-4726-8eba-c95450c2686b-kube-api-access-tswx8\") pod \"olm-operator-6b444d44fb-frgsp\" (UID: \"87734f51-2b97-4726-8eba-c95450c2686b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113027 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/20c7a900-7157-4468-bd68-e74c2d382e85-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bp6kt\" (UID: \"20c7a900-7157-4468-bd68-e74c2d382e85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113092 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/93463415-c819-44b0-9ee7-0de0698eb6a6-secret-volume\") pod \"collect-profiles-29414055-8r2vj\" (UID: \"93463415-c819-44b0-9ee7-0de0698eb6a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113154 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzwd\" (UniqueName: \"kubernetes.io/projected/f5d23b38-2bda-4faf-a3da-8565336a48a2-kube-api-access-fgzwd\") pod \"ingress-canary-d85d4\" (UID: \"f5d23b38-2bda-4faf-a3da-8565336a48a2\") " pod="openshift-ingress-canary/ingress-canary-d85d4" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113180 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j9sd\" (UniqueName: \"kubernetes.io/projected/d686eb67-074c-4086-b33c-c632acc21a66-kube-api-access-9j9sd\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113200 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a8c8337c-874e-4c1d-9656-aeefef264a92-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9sxtr\" (UID: \"a8c8337c-874e-4c1d-9656-aeefef264a92\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113254 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkxn5\" (UniqueName: \"kubernetes.io/projected/20c7a900-7157-4468-bd68-e74c2d382e85-kube-api-access-vkxn5\") pod \"machine-config-controller-84d6567774-bp6kt\" (UID: \"20c7a900-7157-4468-bd68-e74c2d382e85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" Dec 04 10:17:21 crc kubenswrapper[4831]: 
I1204 10:17:21.113307 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113336 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2c81a8-b9ac-4332-a6ae-a27e95e90537-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2pnnb\" (UID: \"ad2c81a8-b9ac-4332-a6ae-a27e95e90537\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113362 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94zfl\" (UniqueName: \"kubernetes.io/projected/db1720b8-e97a-4399-8c33-a0dfe81a4621-kube-api-access-94zfl\") pod \"package-server-manager-789f6589d5-zng9d\" (UID: \"db1720b8-e97a-4399-8c33-a0dfe81a4621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113411 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5psh\" (UniqueName: \"kubernetes.io/projected/c589b350-6e72-40be-9f47-b26e80dc5ba6-kube-api-access-x5psh\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: \"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113437 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab874db5-da93-478d-927b-21104115cf12-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-zln28\" (UID: \"ab874db5-da93-478d-927b-21104115cf12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113502 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/db1720b8-e97a-4399-8c33-a0dfe81a4621-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zng9d\" (UID: \"db1720b8-e97a-4399-8c33-a0dfe81a4621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113555 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77vwm\" (UniqueName: \"kubernetes.io/projected/4ab7a865-d267-4ad7-88bd-e3b9e1f2510c-kube-api-access-77vwm\") pod \"service-ca-9c57cc56f-2rbhc\" (UID: \"4ab7a865-d267-4ad7-88bd-e3b9e1f2510c\") " pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113584 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-mountpoint-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113605 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsh97\" (UniqueName: \"kubernetes.io/projected/a57c45a9-d720-4144-86c6-dc4c3cfbd9bd-kube-api-access-dsh97\") pod \"service-ca-operator-777779d784-mlcpb\" (UID: \"a57c45a9-d720-4144-86c6-dc4c3cfbd9bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113648 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57c45a9-d720-4144-86c6-dc4c3cfbd9bd-config\") pod \"service-ca-operator-777779d784-mlcpb\" (UID: \"a57c45a9-d720-4144-86c6-dc4c3cfbd9bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113763 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c589b350-6e72-40be-9f47-b26e80dc5ba6-images\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: \"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113787 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04bb6d4d-643d-4015-9579-c522cfc43c2a-config-volume\") pod \"dns-default-lzxr6\" (UID: \"04bb6d4d-643d-4015-9579-c522cfc43c2a\") " pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113831 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57c45a9-d720-4144-86c6-dc4c3cfbd9bd-serving-cert\") pod \"service-ca-operator-777779d784-mlcpb\" (UID: \"a57c45a9-d720-4144-86c6-dc4c3cfbd9bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113857 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-plugins-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113900 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20c7a900-7157-4468-bd68-e74c2d382e85-proxy-tls\") pod \"machine-config-controller-84d6567774-bp6kt\" (UID: \"20c7a900-7157-4468-bd68-e74c2d382e85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113931 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68cdd202-055b-41c7-ac5f-a13b918c44fc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mgrdx\" (UID: \"68cdd202-055b-41c7-ac5f-a13b918c44fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.113954 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmrtg\" (UniqueName: \"kubernetes.io/projected/93463415-c819-44b0-9ee7-0de0698eb6a6-kube-api-access-lmrtg\") pod \"collect-profiles-29414055-8r2vj\" (UID: \"93463415-c819-44b0-9ee7-0de0698eb6a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114014 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2c81a8-b9ac-4332-a6ae-a27e95e90537-config\") pod \"kube-apiserver-operator-766d6c64bb-2pnnb\" (UID: \"ad2c81a8-b9ac-4332-a6ae-a27e95e90537\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114039 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sptxp\" (UniqueName: \"kubernetes.io/projected/37cadb5f-eedf-40ab-b4db-44d9935e26eb-kube-api-access-sptxp\") pod 
\"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114085 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c589b350-6e72-40be-9f47-b26e80dc5ba6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: \"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114108 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4ab7a865-d267-4ad7-88bd-e3b9e1f2510c-signing-key\") pod \"service-ca-9c57cc56f-2rbhc\" (UID: \"4ab7a865-d267-4ad7-88bd-e3b9e1f2510c\") " pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114156 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d686eb67-074c-4086-b33c-c632acc21a66-tmpfs\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114259 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eb2c02c-884b-45f4-8387-e2e71df329a9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-twgk8\" (UID: \"7eb2c02c-884b-45f4-8387-e2e71df329a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114298 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qp6r2\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114349 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/87734f51-2b97-4726-8eba-c95450c2686b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-frgsp\" (UID: \"87734f51-2b97-4726-8eba-c95450c2686b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114373 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbjdt\" (UniqueName: \"kubernetes.io/projected/73f9aaec-7f63-4909-9ffe-6b073e0225d9-kube-api-access-hbjdt\") pod \"marketplace-operator-79b997595-qp6r2\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114423 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qp6r2\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114448 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4ab7a865-d267-4ad7-88bd-e3b9e1f2510c-signing-cabundle\") pod \"service-ca-9c57cc56f-2rbhc\" (UID: \"4ab7a865-d267-4ad7-88bd-e3b9e1f2510c\") " pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" Dec 04 
10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114510 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwpvb\" (UniqueName: \"kubernetes.io/projected/04bb6d4d-643d-4015-9579-c522cfc43c2a-kube-api-access-cwpvb\") pod \"dns-default-lzxr6\" (UID: \"04bb6d4d-643d-4015-9579-c522cfc43c2a\") " pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114541 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-csi-data-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114818 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-registration-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114850 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmf8\" (UniqueName: \"kubernetes.io/projected/7eb2c02c-884b-45f4-8387-e2e71df329a9-kube-api-access-xbmf8\") pod \"kube-storage-version-migrator-operator-b67b599dd-twgk8\" (UID: \"7eb2c02c-884b-45f4-8387-e2e71df329a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114903 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/51375b90-5931-416b-8a05-5a76d3e2852f-certs\") pod \"machine-config-server-9q8xt\" (UID: \"51375b90-5931-416b-8a05-5a76d3e2852f\") " 
pod="openshift-machine-config-operator/machine-config-server-9q8xt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114912 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57c45a9-d720-4144-86c6-dc4c3cfbd9bd-config\") pod \"service-ca-operator-777779d784-mlcpb\" (UID: \"a57c45a9-d720-4144-86c6-dc4c3cfbd9bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114934 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w54wf\" (UniqueName: \"kubernetes.io/projected/51375b90-5931-416b-8a05-5a76d3e2852f-kube-api-access-w54wf\") pod \"machine-config-server-9q8xt\" (UID: \"51375b90-5931-416b-8a05-5a76d3e2852f\") " pod="openshift-machine-config-operator/machine-config-server-9q8xt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.114983 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab874db5-da93-478d-927b-21104115cf12-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zln28\" (UID: \"ab874db5-da93-478d-927b-21104115cf12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115004 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/51375b90-5931-416b-8a05-5a76d3e2852f-node-bootstrap-token\") pod \"machine-config-server-9q8xt\" (UID: \"51375b90-5931-416b-8a05-5a76d3e2852f\") " pod="openshift-machine-config-operator/machine-config-server-9q8xt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115026 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/d686eb67-074c-4086-b33c-c632acc21a66-webhook-cert\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115057 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61498aaa-6ad9-4572-954f-40d841edd6d8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wtt7n\" (UID: \"61498aaa-6ad9-4572-954f-40d841edd6d8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115079 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7eb2c02c-884b-45f4-8387-e2e71df329a9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-twgk8\" (UID: \"7eb2c02c-884b-45f4-8387-e2e71df329a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115103 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61498aaa-6ad9-4572-954f-40d841edd6d8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wtt7n\" (UID: \"61498aaa-6ad9-4572-954f-40d841edd6d8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115133 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61498aaa-6ad9-4572-954f-40d841edd6d8-config\") pod \"kube-controller-manager-operator-78b949d7b-wtt7n\" (UID: \"61498aaa-6ad9-4572-954f-40d841edd6d8\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115150 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d686eb67-074c-4086-b33c-c632acc21a66-apiservice-cert\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115177 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5d23b38-2bda-4faf-a3da-8565336a48a2-cert\") pod \"ingress-canary-d85d4\" (UID: \"f5d23b38-2bda-4faf-a3da-8565336a48a2\") " pod="openshift-ingress-canary/ingress-canary-d85d4" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115197 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6vmb\" (UniqueName: \"kubernetes.io/projected/a8c8337c-874e-4c1d-9656-aeefef264a92-kube-api-access-b6vmb\") pod \"multus-admission-controller-857f4d67dd-9sxtr\" (UID: \"a8c8337c-874e-4c1d-9656-aeefef264a92\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93463415-c819-44b0-9ee7-0de0698eb6a6-config-volume\") pod \"collect-profiles-29414055-8r2vj\" (UID: \"93463415-c819-44b0-9ee7-0de0698eb6a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115274 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/ad2c81a8-b9ac-4332-a6ae-a27e95e90537-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2pnnb\" (UID: \"ad2c81a8-b9ac-4332-a6ae-a27e95e90537\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115291 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/87734f51-2b97-4726-8eba-c95450c2686b-srv-cert\") pod \"olm-operator-6b444d44fb-frgsp\" (UID: \"87734f51-2b97-4726-8eba-c95450c2686b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115308 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04bb6d4d-643d-4015-9579-c522cfc43c2a-metrics-tls\") pod \"dns-default-lzxr6\" (UID: \"04bb6d4d-643d-4015-9579-c522cfc43c2a\") " pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115324 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-socket-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115347 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab874db5-da93-478d-927b-21104115cf12-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zln28\" (UID: \"ab874db5-da93-478d-927b-21104115cf12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115371 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zj75s\" (UniqueName: \"kubernetes.io/projected/68cdd202-055b-41c7-ac5f-a13b918c44fc-kube-api-access-zj75s\") pod \"control-plane-machine-set-operator-78cbb6b69f-mgrdx\" (UID: \"68cdd202-055b-41c7-ac5f-a13b918c44fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.115983 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/20c7a900-7157-4468-bd68-e74c2d382e85-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bp6kt\" (UID: \"20c7a900-7157-4468-bd68-e74c2d382e85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.116098 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab874db5-da93-478d-927b-21104115cf12-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zln28\" (UID: \"ab874db5-da93-478d-927b-21104115cf12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.117468 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2c81a8-b9ac-4332-a6ae-a27e95e90537-config\") pod \"kube-apiserver-operator-766d6c64bb-2pnnb\" (UID: \"ad2c81a8-b9ac-4332-a6ae-a27e95e90537\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.117529 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93463415-c819-44b0-9ee7-0de0698eb6a6-secret-volume\") pod \"collect-profiles-29414055-8r2vj\" (UID: \"93463415-c819-44b0-9ee7-0de0698eb6a6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.118274 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c589b350-6e72-40be-9f47-b26e80dc5ba6-proxy-tls\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: \"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.120199 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61498aaa-6ad9-4572-954f-40d841edd6d8-config\") pod \"kube-controller-manager-operator-78b949d7b-wtt7n\" (UID: \"61498aaa-6ad9-4572-954f-40d841edd6d8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.120632 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a8c8337c-874e-4c1d-9656-aeefef264a92-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9sxtr\" (UID: \"a8c8337c-874e-4c1d-9656-aeefef264a92\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.120931 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d686eb67-074c-4086-b33c-c632acc21a66-tmpfs\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.121005 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-socket-dir\") pod 
\"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.121043 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-mountpoint-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.121529 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04bb6d4d-643d-4015-9579-c522cfc43c2a-config-volume\") pod \"dns-default-lzxr6\" (UID: \"04bb6d4d-643d-4015-9579-c522cfc43c2a\") " pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.121605 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-registration-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.121645 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-plugins-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.121907 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/37cadb5f-eedf-40ab-b4db-44d9935e26eb-csi-data-dir\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " 
pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.122541 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/51375b90-5931-416b-8a05-5a76d3e2852f-node-bootstrap-token\") pod \"machine-config-server-9q8xt\" (UID: \"51375b90-5931-416b-8a05-5a76d3e2852f\") " pod="openshift-machine-config-operator/machine-config-server-9q8xt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.123128 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93463415-c819-44b0-9ee7-0de0698eb6a6-config-volume\") pod \"collect-profiles-29414055-8r2vj\" (UID: \"93463415-c819-44b0-9ee7-0de0698eb6a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.123226 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qp6r2\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.123632 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7eb2c02c-884b-45f4-8387-e2e71df329a9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-twgk8\" (UID: \"7eb2c02c-884b-45f4-8387-e2e71df329a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.124421 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/87734f51-2b97-4726-8eba-c95450c2686b-srv-cert\") 
pod \"olm-operator-6b444d44fb-frgsp\" (UID: \"87734f51-2b97-4726-8eba-c95450c2686b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.124437 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab874db5-da93-478d-927b-21104115cf12-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zln28\" (UID: \"ab874db5-da93-478d-927b-21104115cf12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.125035 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5d23b38-2bda-4faf-a3da-8565336a48a2-cert\") pod \"ingress-canary-d85d4\" (UID: \"f5d23b38-2bda-4faf-a3da-8565336a48a2\") " pod="openshift-ingress-canary/ingress-canary-d85d4" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.125296 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d686eb67-074c-4086-b33c-c632acc21a66-webhook-cert\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.125331 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:21.625317067 +0000 UTC m=+138.574492381 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.125504 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c589b350-6e72-40be-9f47-b26e80dc5ba6-images\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: \"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.126083 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20c7a900-7157-4468-bd68-e74c2d382e85-proxy-tls\") pod \"machine-config-controller-84d6567774-bp6kt\" (UID: \"20c7a900-7157-4468-bd68-e74c2d382e85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.126127 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d686eb67-074c-4086-b33c-c632acc21a66-apiservice-cert\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.126579 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4ab7a865-d267-4ad7-88bd-e3b9e1f2510c-signing-cabundle\") pod \"service-ca-9c57cc56f-2rbhc\" (UID: 
\"4ab7a865-d267-4ad7-88bd-e3b9e1f2510c\") " pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.128299 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2c81a8-b9ac-4332-a6ae-a27e95e90537-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2pnnb\" (UID: \"ad2c81a8-b9ac-4332-a6ae-a27e95e90537\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.128589 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57c45a9-d720-4144-86c6-dc4c3cfbd9bd-serving-cert\") pod \"service-ca-operator-777779d784-mlcpb\" (UID: \"a57c45a9-d720-4144-86c6-dc4c3cfbd9bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.129252 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/db1720b8-e97a-4399-8c33-a0dfe81a4621-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zng9d\" (UID: \"db1720b8-e97a-4399-8c33-a0dfe81a4621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.129988 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61498aaa-6ad9-4572-954f-40d841edd6d8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wtt7n\" (UID: \"61498aaa-6ad9-4572-954f-40d841edd6d8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.130154 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/51375b90-5931-416b-8a05-5a76d3e2852f-certs\") pod \"machine-config-server-9q8xt\" (UID: \"51375b90-5931-416b-8a05-5a76d3e2852f\") " pod="openshift-machine-config-operator/machine-config-server-9q8xt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.131094 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68cdd202-055b-41c7-ac5f-a13b918c44fc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mgrdx\" (UID: \"68cdd202-055b-41c7-ac5f-a13b918c44fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.132206 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/87734f51-2b97-4726-8eba-c95450c2686b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-frgsp\" (UID: \"87734f51-2b97-4726-8eba-c95450c2686b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.133897 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qp6r2\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.136410 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4ab7a865-d267-4ad7-88bd-e3b9e1f2510c-signing-key\") pod \"service-ca-9c57cc56f-2rbhc\" (UID: \"4ab7a865-d267-4ad7-88bd-e3b9e1f2510c\") " pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" Dec 04 10:17:21 crc kubenswrapper[4831]: 
I1204 10:17:21.136439 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-skjn8"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.153892 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3acf1264-0fb4-46e4-a876-8e7677b39304-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6fssm\" (UID: \"3acf1264-0fb4-46e4-a876-8e7677b39304\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.154449 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.165394 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7slwb\" (UniqueName: \"kubernetes.io/projected/25ecb4dd-adbb-48db-8563-78c6f9faff4f-kube-api-access-7slwb\") pod \"cluster-image-registry-operator-dc59b4c8b-pr7kc\" (UID: \"25ecb4dd-adbb-48db-8563-78c6f9faff4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.168313 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-27tcv"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.173090 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pgqgv"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.176812 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.178279 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g85sg\" (UniqueName: \"kubernetes.io/projected/4783614b-88f9-4207-b6ef-f73824ce9334-kube-api-access-g85sg\") pod \"router-default-5444994796-nmqqm\" (UID: \"4783614b-88f9-4207-b6ef-f73824ce9334\") " pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.185167 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.212722 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkxn5\" (UniqueName: \"kubernetes.io/projected/20c7a900-7157-4468-bd68-e74c2d382e85-kube-api-access-vkxn5\") pod \"machine-config-controller-84d6567774-bp6kt\" (UID: \"20c7a900-7157-4468-bd68-e74c2d382e85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.216613 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.216994 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:21.716939623 +0000 UTC m=+138.666114937 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.235073 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j9sd\" (UniqueName: \"kubernetes.io/projected/d686eb67-074c-4086-b33c-c632acc21a66-kube-api-access-9j9sd\") pod \"packageserver-d55dfcdfc-qkzn2\" (UID: \"d686eb67-074c-4086-b33c-c632acc21a66\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.247541 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-d8prt"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.253024 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgzwd\" (UniqueName: \"kubernetes.io/projected/f5d23b38-2bda-4faf-a3da-8565336a48a2-kube-api-access-fgzwd\") pod \"ingress-canary-d85d4\" (UID: \"f5d23b38-2bda-4faf-a3da-8565336a48a2\") " pod="openshift-ingress-canary/ingress-canary-d85d4" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.267396 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.274089 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w54wf\" (UniqueName: \"kubernetes.io/projected/51375b90-5931-416b-8a05-5a76d3e2852f-kube-api-access-w54wf\") pod \"machine-config-server-9q8xt\" (UID: \"51375b90-5931-416b-8a05-5a76d3e2852f\") " pod="openshift-machine-config-operator/machine-config-server-9q8xt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.295421 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj75s\" (UniqueName: \"kubernetes.io/projected/68cdd202-055b-41c7-ac5f-a13b918c44fc-kube-api-access-zj75s\") pod \"control-plane-machine-set-operator-78cbb6b69f-mgrdx\" (UID: \"68cdd202-055b-41c7-ac5f-a13b918c44fc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.313997 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad2c81a8-b9ac-4332-a6ae-a27e95e90537-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2pnnb\" (UID: \"ad2c81a8-b9ac-4332-a6ae-a27e95e90537\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.319585 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.319986 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:21.819952158 +0000 UTC m=+138.769127472 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.325978 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9q8xt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.331982 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.332034 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.332045 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2h8nq"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.333251 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cmg79"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.336562 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61498aaa-6ad9-4572-954f-40d841edd6d8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wtt7n\" (UID: 
\"61498aaa-6ad9-4572-954f-40d841edd6d8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.353269 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77vwm\" (UniqueName: \"kubernetes.io/projected/4ab7a865-d267-4ad7-88bd-e3b9e1f2510c-kube-api-access-77vwm\") pod \"service-ca-9c57cc56f-2rbhc\" (UID: \"4ab7a865-d267-4ad7-88bd-e3b9e1f2510c\") " pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.356279 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d85d4" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.372881 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmrtg\" (UniqueName: \"kubernetes.io/projected/93463415-c819-44b0-9ee7-0de0698eb6a6-kube-api-access-lmrtg\") pod \"collect-profiles-29414055-8r2vj\" (UID: \"93463415-c819-44b0-9ee7-0de0698eb6a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.392515 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tswx8\" (UniqueName: \"kubernetes.io/projected/87734f51-2b97-4726-8eba-c95450c2686b-kube-api-access-tswx8\") pod \"olm-operator-6b444d44fb-frgsp\" (UID: \"87734f51-2b97-4726-8eba-c95450c2686b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.410586 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9"] Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.414595 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4bj64"] Dec 04 10:17:21 crc 
kubenswrapper[4831]: I1204 10:17:21.417503 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsh97\" (UniqueName: \"kubernetes.io/projected/a57c45a9-d720-4144-86c6-dc4c3cfbd9bd-kube-api-access-dsh97\") pod \"service-ca-operator-777779d784-mlcpb\" (UID: \"a57c45a9-d720-4144-86c6-dc4c3cfbd9bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.420412 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.420574 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:21.920543573 +0000 UTC m=+138.869718907 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.420834 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.421141 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:21.92112698 +0000 UTC m=+138.870302284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.434674 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sptxp\" (UniqueName: \"kubernetes.io/projected/37cadb5f-eedf-40ab-b4db-44d9935e26eb-kube-api-access-sptxp\") pod \"csi-hostpathplugin-56tfw\" (UID: \"37cadb5f-eedf-40ab-b4db-44d9935e26eb\") " pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.437491 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.447587 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.457611 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbjdt\" (UniqueName: \"kubernetes.io/projected/73f9aaec-7f63-4909-9ffe-6b073e0225d9-kube-api-access-hbjdt\") pod \"marketplace-operator-79b997595-qp6r2\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.471450 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwpvb\" (UniqueName: \"kubernetes.io/projected/04bb6d4d-643d-4015-9579-c522cfc43c2a-kube-api-access-cwpvb\") pod \"dns-default-lzxr6\" (UID: \"04bb6d4d-643d-4015-9579-c522cfc43c2a\") " pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.491341 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.493552 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmf8\" (UniqueName: \"kubernetes.io/projected/7eb2c02c-884b-45f4-8387-e2e71df329a9-kube-api-access-xbmf8\") pod \"kube-storage-version-migrator-operator-b67b599dd-twgk8\" (UID: \"7eb2c02c-884b-45f4-8387-e2e71df329a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.498354 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.521502 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.521635 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.021608832 +0000 UTC m=+138.970784146 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.521927 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.522254 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.02224175 +0000 UTC m=+138.971417064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.533270 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6vmb\" (UniqueName: \"kubernetes.io/projected/a8c8337c-874e-4c1d-9656-aeefef264a92-kube-api-access-b6vmb\") pod \"multus-admission-controller-857f4d67dd-9sxtr\" (UID: \"a8c8337c-874e-4c1d-9656-aeefef264a92\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.552280 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94zfl\" (UniqueName: \"kubernetes.io/projected/db1720b8-e97a-4399-8c33-a0dfe81a4621-kube-api-access-94zfl\") pod \"package-server-manager-789f6589d5-zng9d\" (UID: \"db1720b8-e97a-4399-8c33-a0dfe81a4621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.552502 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.560748 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.573585 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.579984 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.587182 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.593604 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.609398 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.623458 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.623555 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.123534145 +0000 UTC m=+139.072709469 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.623716 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.624059 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.12404897 +0000 UTC m=+139.073224294 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.647227 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-56tfw" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.724625 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.724843 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.224806199 +0000 UTC m=+139.173981523 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.725066 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.725585 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.225567231 +0000 UTC m=+139.174742565 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.806838 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.826811 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.827290 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.327268158 +0000 UTC m=+139.276443482 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.830564 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" Dec 04 10:17:21 crc kubenswrapper[4831]: I1204 10:17:21.928706 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:21 crc kubenswrapper[4831]: E1204 10:17:21.929157 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.42914134 +0000 UTC m=+139.378316664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.033256 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:22 crc kubenswrapper[4831]: E1204 10:17:22.037166 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.537085427 +0000 UTC m=+139.486260761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.058876 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-twg79" event={"ID":"bc1e4c46-b909-410a-980d-a045d5b3a636","Type":"ContainerStarted","Data":"917523e78438bd882750464ab15c15a4f9116a44cf5d4b1095f0318ce2615eba"} Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.138518 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:22 crc kubenswrapper[4831]: E1204 10:17:22.138974 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.638954109 +0000 UTC m=+139.588129523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.194612 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04bb6d4d-643d-4015-9579-c522cfc43c2a-metrics-tls\") pod \"dns-default-lzxr6\" (UID: \"04bb6d4d-643d-4015-9579-c522cfc43c2a\") " pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.195293 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c589b350-6e72-40be-9f47-b26e80dc5ba6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: \"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.195582 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eb2c02c-884b-45f4-8387-e2e71df329a9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-twgk8\" (UID: \"7eb2c02c-884b-45f4-8387-e2e71df329a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.203925 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvshm\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-kube-api-access-wvshm\") pod 
\"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:22 crc kubenswrapper[4831]: W1204 10:17:22.205790 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedd157b6_46a0_4a10_94fb_670544f743ca.slice/crio-b73534be47bcd81604328d43ec7c3924fb6734d07024444576849fe63cbeb56e WatchSource:0}: Error finding container b73534be47bcd81604328d43ec7c3924fb6734d07024444576849fe63cbeb56e: Status 404 returned error can't find the container with id b73534be47bcd81604328d43ec7c3924fb6734d07024444576849fe63cbeb56e Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.206235 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab874db5-da93-478d-927b-21104115cf12-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zln28\" (UID: \"ab874db5-da93-478d-927b-21104115cf12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" Dec 04 10:17:22 crc kubenswrapper[4831]: W1204 10:17:22.209438 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b483323_5ed6_40b5_b256_c9de7033e4eb.slice/crio-a4ab4eb18963e5eb18ffad30f6ac3e43e26a1ea255f8496c712d742a8bc03e3f WatchSource:0}: Error finding container a4ab4eb18963e5eb18ffad30f6ac3e43e26a1ea255f8496c712d742a8bc03e3f: Status 404 returned error can't find the container with id a4ab4eb18963e5eb18ffad30f6ac3e43e26a1ea255f8496c712d742a8bc03e3f Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.215213 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5psh\" (UniqueName: \"kubernetes.io/projected/c589b350-6e72-40be-9f47-b26e80dc5ba6-kube-api-access-x5psh\") pod \"machine-config-operator-74547568cd-c4vs7\" (UID: 
\"c589b350-6e72-40be-9f47-b26e80dc5ba6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:22 crc kubenswrapper[4831]: W1204 10:17:22.216005 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bed6763_7fd9_492f_8a67_cd65b0061950.slice/crio-47fa486134ce131e646fb261ca7daa2074f2988366a82db456955d5a5a579153 WatchSource:0}: Error finding container 47fa486134ce131e646fb261ca7daa2074f2988366a82db456955d5a5a579153: Status 404 returned error can't find the container with id 47fa486134ce131e646fb261ca7daa2074f2988366a82db456955d5a5a579153 Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.216133 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:22 crc kubenswrapper[4831]: W1204 10:17:22.217323 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d3db4fd_37e8_4257_8bb7_6ab5183de9e7.slice/crio-813eff7516bbfe30cfd8366c025c3b7ad3be75bd9f4b0bf601c2245af9b2f883 WatchSource:0}: Error finding container 813eff7516bbfe30cfd8366c025c3b7ad3be75bd9f4b0bf601c2245af9b2f883: Status 404 returned error can't find the container with id 813eff7516bbfe30cfd8366c025c3b7ad3be75bd9f4b0bf601c2245af9b2f883 Dec 04 10:17:22 crc kubenswrapper[4831]: W1204 10:17:22.218280 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fa1b6b7_1b8d_4c8c_96ac_9ed7c2297cc3.slice/crio-0904d2e17a1c0aa53a8c8f48f89bf14d1fbb84e936d2676480dca09a225514d0 WatchSource:0}: Error finding container 0904d2e17a1c0aa53a8c8f48f89bf14d1fbb84e936d2676480dca09a225514d0: Status 404 returned error can't find the container with id 0904d2e17a1c0aa53a8c8f48f89bf14d1fbb84e936d2676480dca09a225514d0 Dec 04 10:17:22 crc kubenswrapper[4831]: W1204 10:17:22.224289 4831 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode57fed72_8221_44fb_8108_44e9a0dcc51a.slice/crio-74ad36286df7b77e1f8803e213c42a06723b5823f7c7e8b78f5001e3705f3f85 WatchSource:0}: Error finding container 74ad36286df7b77e1f8803e213c42a06723b5823f7c7e8b78f5001e3705f3f85: Status 404 returned error can't find the container with id 74ad36286df7b77e1f8803e213c42a06723b5823f7c7e8b78f5001e3705f3f85 Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.242424 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:22 crc kubenswrapper[4831]: E1204 10:17:22.244714 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.744642932 +0000 UTC m=+139.693818256 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.245788 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:22 crc kubenswrapper[4831]: E1204 10:17:22.246215 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.746196906 +0000 UTC m=+139.695372220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.349882 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:22 crc kubenswrapper[4831]: E1204 10:17:22.350482 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.850452877 +0000 UTC m=+139.799628191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.422984 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.446563 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.451787 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:22 crc kubenswrapper[4831]: E1204 10:17:22.452272 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:22.952260277 +0000 UTC m=+139.901435581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.475493 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qp6r2"] Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.501252 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28"
Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.552808 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 10:17:22 crc kubenswrapper[4831]: E1204 10:17:22.553160 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:23.053142191 +0000 UTC m=+140.002317505 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.574167 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc"]
Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.621110 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm"]
Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.653963 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb"
Dec 04 10:17:22 crc kubenswrapper[4831]: E1204 10:17:22.654456 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:23.154434396 +0000 UTC m=+140.103609760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.699474 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7czl2"]
Dec 04 10:17:22 crc kubenswrapper[4831]: W1204 10:17:22.709103 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73f9aaec_7f63_4909_9ffe_6b073e0225d9.slice/crio-89985e9b156ba7785d44196e744bc00ab4e4ca268a9a5c073e98447b1fc519aa WatchSource:0}: Error finding container 89985e9b156ba7785d44196e744bc00ab4e4ca268a9a5c073e98447b1fc519aa: Status 404 returned error can't find the container with id 89985e9b156ba7785d44196e744bc00ab4e4ca268a9a5c073e98447b1fc519aa
Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.755589 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 10:17:22 crc kubenswrapper[4831]: E1204 10:17:22.756317 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:23.256288848 +0000 UTC m=+140.205464172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:22 crc kubenswrapper[4831]: W1204 10:17:22.759545 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25ecb4dd_adbb_48db_8563_78c6f9faff4f.slice/crio-af057ee8a93257c44071e62dad4dff35ace66908395a63a6d44b21fe5708d6cd WatchSource:0}: Error finding container af057ee8a93257c44071e62dad4dff35ace66908395a63a6d44b21fe5708d6cd: Status 404 returned error can't find the container with id af057ee8a93257c44071e62dad4dff35ace66908395a63a6d44b21fe5708d6cd
Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.857115 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb"
Dec 04 10:17:22 crc kubenswrapper[4831]: E1204 10:17:22.857696 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:23.357678836 +0000 UTC m=+140.306854150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.884435 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d85d4"]
Dec 04 10:17:22 crc kubenswrapper[4831]: I1204 10:17:22.965306 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb"]
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.063468 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" event={"ID":"7da1b053-67d9-4a4c-842e-cafb5dce5017","Type":"ContainerStarted","Data":"48557a17c318129fb97cbc1aaf79ae4c67fcefcc5907a3a4c68de7bda1a4e612"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.064255 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" event={"ID":"edd157b6-46a0-4a10-94fb-670544f743ca","Type":"ContainerStarted","Data":"b73534be47bcd81604328d43ec7c3924fb6734d07024444576849fe63cbeb56e"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.065402 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" event={"ID":"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7","Type":"ContainerStarted","Data":"813eff7516bbfe30cfd8366c025c3b7ad3be75bd9f4b0bf601c2245af9b2f883"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.066515 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" event={"ID":"25ecb4dd-adbb-48db-8563-78c6f9faff4f","Type":"ContainerStarted","Data":"af057ee8a93257c44071e62dad4dff35ace66908395a63a6d44b21fe5708d6cd"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.067614 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" event={"ID":"db6feab7-169d-4a82-a204-f9a1ff56f85e","Type":"ContainerStarted","Data":"757b3ecd5bc9504c838f3db8c323a264c327783f9c94efbe9e1574e1a601e5a3"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.068521 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" event={"ID":"bf8f0aa6-641c-4258-bd46-541bf71d40b1","Type":"ContainerStarted","Data":"8d2383b2cbfbf4f762cf479f65bb85c52cff7c18bad1789472f613d55f6c636f"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.069671 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-twg79" event={"ID":"bc1e4c46-b909-410a-980d-a045d5b3a636","Type":"ContainerStarted","Data":"7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.071228 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" event={"ID":"73f9aaec-7f63-4909-9ffe-6b073e0225d9","Type":"ContainerStarted","Data":"89985e9b156ba7785d44196e744bc00ab4e4ca268a9a5c073e98447b1fc519aa"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.071997 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" event={"ID":"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3","Type":"ContainerStarted","Data":"0904d2e17a1c0aa53a8c8f48f89bf14d1fbb84e936d2676480dca09a225514d0"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.072792 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" event={"ID":"e3ea5ca7-6c75-482a-9245-518056647743","Type":"ContainerStarted","Data":"2a82d005cb9dde8507499bdbf78ae7e15414e060602512261f4fe4a4ac8a9ff2"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.073454 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pgqgv" event={"ID":"7bed6763-7fd9-492f-8a67-cd65b0061950","Type":"ContainerStarted","Data":"47fa486134ce131e646fb261ca7daa2074f2988366a82db456955d5a5a579153"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.074278 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2h8nq" event={"ID":"08171e38-fc77-4d5a-acc9-58dca0784830","Type":"ContainerStarted","Data":"502156cd20a12cca5cfd4811b13e3b5084b2fd0e089931c750e8b109213b9c53"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.074995 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" event={"ID":"e57fed72-8221-44fb-8108-44e9a0dcc51a","Type":"ContainerStarted","Data":"74ad36286df7b77e1f8803e213c42a06723b5823f7c7e8b78f5001e3705f3f85"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.076741 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" event={"ID":"fbe252df-9100-4e9e-a054-1463b3fa50ed","Type":"ContainerStarted","Data":"451ab399937517ba3f4e20b2786dfb338c7cf39f87d55c0a604684dfdba4635b"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.078768 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" event={"ID":"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4","Type":"ContainerStarted","Data":"16e596203491d0c75e69d45a44a8366cf2d09b1f64ecb6a4fd7058ede97e165b"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.079691 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" event={"ID":"45a39bbd-157a-404e-ae10-a4a2893de563","Type":"ContainerStarted","Data":"451086a6a2e4cdddfc3322955e7cd40413f919b6dd030f70a297c7caab57ef90"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.080918 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" event={"ID":"2b483323-5ed6-40b5-b256-c9de7033e4eb","Type":"ContainerStarted","Data":"a4ab4eb18963e5eb18ffad30f6ac3e43e26a1ea255f8496c712d742a8bc03e3f"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.081701 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nmqqm" event={"ID":"4783614b-88f9-4207-b6ef-f73824ce9334","Type":"ContainerStarted","Data":"84a4e2b72dc134a3fb2eb50e42fce25afb6262556e76cf3413cacdc561afc637"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.082494 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" event={"ID":"9a3a8166-a7c6-4fb0-a2a1-83003acf436f","Type":"ContainerStarted","Data":"cda500ed39521f34632ba915d28dd30e6cd46d9c83d2abd8b1d3b4827d60ddb8"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.083080 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" event={"ID":"51167dc9-74db-4809-8d67-b9e4a48c039c","Type":"ContainerStarted","Data":"b9cbb2c70e364e18c8fca4a10fefd315b10fd814c6a33326b809399b59160665"}
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.204295 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 10:17:23 crc kubenswrapper[4831]: E1204 10:17:23.205416 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:23.705395798 +0000 UTC m=+140.654571112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.237304 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq"]
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.252369 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n"]
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.261866 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-twg79" podStartSLOduration=120.261845238 podStartE2EDuration="2m0.261845238s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:23.259641484 +0000 UTC m=+140.208816808" watchObservedRunningTime="2025-12-04 10:17:23.261845238 +0000 UTC m=+140.211020552"
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.289413 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbnbq" podStartSLOduration=120.289389283 podStartE2EDuration="2m0.289389283s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:23.286547581 +0000 UTC m=+140.235722905" watchObservedRunningTime="2025-12-04 10:17:23.289389283 +0000 UTC m=+140.238564597"
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.306494 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb"
Dec 04 10:17:23 crc kubenswrapper[4831]: E1204 10:17:23.307235 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:23.807223078 +0000 UTC m=+140.756398392 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.351312 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-56tfw"]
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.407428 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 10:17:23 crc kubenswrapper[4831]: E1204 10:17:23.408036 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:23.908004869 +0000 UTC m=+140.857180183 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.408145 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb"
Dec 04 10:17:23 crc kubenswrapper[4831]: E1204 10:17:23.408592 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:23.908577936 +0000 UTC m=+140.857753250 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.419246 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2"]
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.514630 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 10:17:23 crc kubenswrapper[4831]: E1204 10:17:23.515023 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:24.015008269 +0000 UTC m=+140.964183583 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:23 crc kubenswrapper[4831]: W1204 10:17:23.524252 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb40be81d_febb_4200_8271_9b562f0ace35.slice/crio-be0b75a7c597023cfb2f1a6cc50e3265cbe33240c1caafeca0f4561697f23947 WatchSource:0}: Error finding container be0b75a7c597023cfb2f1a6cc50e3265cbe33240c1caafeca0f4561697f23947: Status 404 returned error can't find the container with id be0b75a7c597023cfb2f1a6cc50e3265cbe33240c1caafeca0f4561697f23947
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.618994 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb"
Dec 04 10:17:23 crc kubenswrapper[4831]: E1204 10:17:23.619578 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:24.119551738 +0000 UTC m=+141.068727052 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.694851 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lzxr6"]
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.720332 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 10:17:23 crc kubenswrapper[4831]: E1204 10:17:23.720579 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:24.220517464 +0000 UTC m=+141.169692778 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.720997 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb"
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.721134 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb"]
Dec 04 10:17:23 crc kubenswrapper[4831]: E1204 10:17:23.721546 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:24.221526763 +0000 UTC m=+141.170702078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.822224 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 10:17:23 crc kubenswrapper[4831]: E1204 10:17:23.823145 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:24.323128888 +0000 UTC m=+141.272304202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:23 crc kubenswrapper[4831]: I1204 10:17:23.923634 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb"
Dec 04 10:17:23 crc kubenswrapper[4831]: E1204 10:17:23.923964 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:24.423952479 +0000 UTC m=+141.373127793 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.025100 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 10:17:24 crc kubenswrapper[4831]: E1204 10:17:24.025499 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:24.525470661 +0000 UTC m=+141.474645975 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.111801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9q8xt" event={"ID":"51375b90-5931-416b-8a05-5a76d3e2852f","Type":"ContainerStarted","Data":"bbbdc0dc188f0deed7cec9a811112e3187d9e9e7ba798d8c5e393d675bb0c14f"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.118181 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" event={"ID":"0d3db4fd-37e8-4257-8bb7-6ab5183de9e7","Type":"ContainerStarted","Data":"0f8cd1fc10f83d8aed782c55676318af0bb7bec63ba8696e3d64cc3e500be3b4"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.127492 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb"
Dec 04 10:17:24 crc kubenswrapper[4831]: E1204 10:17:24.128014 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:24.627986412 +0000 UTC m=+141.577161726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.139829 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" event={"ID":"edd157b6-46a0-4a10-94fb-670544f743ca","Type":"ContainerStarted","Data":"1dd9407ee7daf190ffbc1948c448cbf5edd5f023b937130a5bdcff4a053f3013"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.140352 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb"
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.140522 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-27tcv" podStartSLOduration=121.140501353 podStartE2EDuration="2m1.140501353s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:24.136199949 +0000 UTC m=+141.085375283" watchObservedRunningTime="2025-12-04 10:17:24.140501353 +0000 UTC m=+141.089676667"
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.155212 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" event={"ID":"3acf1264-0fb4-46e4-a876-8e7677b39304","Type":"ContainerStarted","Data":"2c5cd82812394b10e31c5e701dd1df1e6b967f7e41b13057595493d1ad1405fb"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.158123 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pgqgv" event={"ID":"7bed6763-7fd9-492f-8a67-cd65b0061950","Type":"ContainerStarted","Data":"441006a39cf6690aecfcd8c3915a2d2855f5c09e5aa576a4297c0c3f0b9451c1"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.158840 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-pgqgv"
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.167614 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" podStartSLOduration=120.167594046 podStartE2EDuration="2m0.167594046s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:24.167174054 +0000 UTC m=+141.116349388" watchObservedRunningTime="2025-12-04 10:17:24.167594046 +0000 UTC m=+141.116769350"
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.170028 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-56tfw" event={"ID":"37cadb5f-eedf-40ab-b4db-44d9935e26eb","Type":"ContainerStarted","Data":"b26011d02c2a420b07dee711af1322b305d7ec12f45c854ebd5dc59e6383e16b"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.184073 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" event={"ID":"b40be81d-febb-4200-8271-9b562f0ace35","Type":"ContainerStarted","Data":"be0b75a7c597023cfb2f1a6cc50e3265cbe33240c1caafeca0f4561697f23947"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.186903 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" event={"ID":"7da1b053-67d9-4a4c-842e-cafb5dce5017","Type":"ContainerStarted","Data":"c8c7b8e9e63cb0f7c9e271be66cdf16f6c440a625ada9a4678c12535315c695f"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.188041 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" event={"ID":"ad2c81a8-b9ac-4332-a6ae-a27e95e90537","Type":"ContainerStarted","Data":"316dd8f1ec7383e480d8d3dbb9e9b68fe3f601ad7e770c088f360dae457bc7b8"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.189237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" event={"ID":"a57c45a9-d720-4144-86c6-dc4c3cfbd9bd","Type":"ContainerStarted","Data":"fde0d354fbaf7bed8ce4403db6f11117152ea9bce5829e7598bcf77696be35b4"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.192950 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" event={"ID":"d686eb67-074c-4086-b33c-c632acc21a66","Type":"ContainerStarted","Data":"1394be810064d18ad3aca8b88b919d8066b19ef2d353727d27b8f9be4b78a541"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.203676 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" event={"ID":"fbe252df-9100-4e9e-a054-1463b3fa50ed","Type":"ContainerStarted","Data":"bba913defcfac54d0c38a68376f319967eeebd21bc47fea314166b3c4b132845"}
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.204078 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9"
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.217907 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" podStartSLOduration=120.217887338 podStartE2EDuration="2m0.217887338s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:24.216892709 +0000 UTC m=+141.166068043" watchObservedRunningTime="2025-12-04 10:17:24.217887338 +0000 UTC m=+141.167062652"
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.218742 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-pgqgv" podStartSLOduration=121.218735723 podStartE2EDuration="2m1.218735723s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:24.185180974 +0000 UTC m=+141.134356318" watchObservedRunningTime="2025-12-04 10:17:24.218735723 +0000 UTC m=+141.167911047"
Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.229332 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 10:17:24 crc kubenswrapper[4831]: E1204 10:17:24.230342 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:24.730325407 +0000 UTC m=+141.679500721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.236706 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l4jk9" Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.272358 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" event={"ID":"bf8f0aa6-641c-4258-bd46-541bf71d40b1","Type":"ContainerDied","Data":"b90ffaaa863d5051ef1e64ae26b955cbaae3cf7e2207f75d8f881810e6ffbaef"} Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.272424 4831 generic.go:334] "Generic (PLEG): container finished" podID="bf8f0aa6-641c-4258-bd46-541bf71d40b1" containerID="b90ffaaa863d5051ef1e64ae26b955cbaae3cf7e2207f75d8f881810e6ffbaef" exitCode=0 Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.400187 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.400599 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lzxr6" event={"ID":"04bb6d4d-643d-4015-9579-c522cfc43c2a","Type":"ContainerStarted","Data":"dadab234184bd8c594cd8fdbd920224c640c4d8992c040af1f7e70103f561a59"} Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.403982 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:24 crc kubenswrapper[4831]: E1204 10:17:24.411294 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:24.911279483 +0000 UTC m=+141.860454797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.413252 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d85d4" event={"ID":"f5d23b38-2bda-4faf-a3da-8565336a48a2","Type":"ContainerStarted","Data":"af12a3578178754325056ee2ef0929bf4eda919980d89992261c90619a7f95ec"} Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.461823 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq" event={"ID":"effa094e-e36b-4039-8523-dc11cffa2894","Type":"ContainerStarted","Data":"846b0db5374f0ae3242b5c85112854b909ab196a4dc758db1ecce6a6b68054fd"} Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.508265 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:24 crc kubenswrapper[4831]: E1204 10:17:24.514580 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:25.014557736 +0000 UTC m=+141.963733050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.516517 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:24 crc kubenswrapper[4831]: E1204 10:17:24.517240 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:25.017194292 +0000 UTC m=+141.966369606 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.520433 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" event={"ID":"51167dc9-74db-4809-8d67-b9e4a48c039c","Type":"ContainerStarted","Data":"6e59d7a60ebeaffb3b499514b4355a22941268653c48bfe1e1a9ec0bbedbd756"} Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.522417 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" event={"ID":"61498aaa-6ad9-4572-954f-40d841edd6d8","Type":"ContainerStarted","Data":"6c4c3746c31e155a7ef75582b9a9e886fc7aa572419df2f763479424fc90110d"} Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.523905 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2h8nq" event={"ID":"08171e38-fc77-4d5a-acc9-58dca0784830","Type":"ContainerStarted","Data":"510664fab7e1943211c4d08f383507ac12b7446e3123094853f673acb2a57c39"} Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.524585 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2h8nq" Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.537831 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" 
event={"ID":"2b483323-5ed6-40b5-b256-c9de7033e4eb","Type":"ContainerStarted","Data":"66e55b3027ad31a707d9bf24e1630fb49945d54e22f6f0f95452c76e4067ee58"} Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.538526 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.552398 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m44sl" podStartSLOduration=121.552379078 podStartE2EDuration="2m1.552379078s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:24.543809751 +0000 UTC m=+141.492985055" watchObservedRunningTime="2025-12-04 10:17:24.552379078 +0000 UTC m=+141.501554382" Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.556987 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-2h8nq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.557181 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2h8nq" podUID="08171e38-fc77-4d5a-acc9-58dca0784830" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.581824 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" podStartSLOduration=121.581806418 podStartE2EDuration="2m1.581806418s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:24.574038164 +0000 UTC m=+141.523213478" watchObservedRunningTime="2025-12-04 10:17:24.581806418 +0000 UTC m=+141.530981732" Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.583161 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx"] Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.591124 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.594711 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-pgqgv" Dec 04 10:17:24 crc kubenswrapper[4831]: W1204 10:17:24.598495 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68cdd202_055b_41c7_ac5f_a13b918c44fc.slice/crio-37750ca4acab53b909e7db354cff3610e593008ff51683eaefaa7a6faf62c81d WatchSource:0}: Error finding container 37750ca4acab53b909e7db354cff3610e593008ff51683eaefaa7a6faf62c81d: Status 404 returned error can't find the container with id 37750ca4acab53b909e7db354cff3610e593008ff51683eaefaa7a6faf62c81d Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.617411 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:24 crc kubenswrapper[4831]: E1204 10:17:24.617793 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:25.117742196 +0000 UTC m=+142.066917510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.618188 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:24 crc kubenswrapper[4831]: E1204 10:17:24.619305 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:25.119295861 +0000 UTC m=+142.068471175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.626202 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt"] Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.636245 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2h8nq" podStartSLOduration=121.63622872 podStartE2EDuration="2m1.63622872s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:24.587706738 +0000 UTC m=+141.536882042" watchObservedRunningTime="2025-12-04 10:17:24.63622872 +0000 UTC m=+141.585404034" Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.703179 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2rbhc"] Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.722255 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:24 crc kubenswrapper[4831]: E1204 10:17:24.723838 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:25.223796369 +0000 UTC m=+142.172971683 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.752146 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d"] Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.795933 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp"] Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.824047 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:24 crc kubenswrapper[4831]: E1204 10:17:24.824551 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:25.324540508 +0000 UTC m=+142.273715822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.872628 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj"] Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.882413 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8"] Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.894191 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28"] Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.909579 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7"] Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.924585 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:24 crc kubenswrapper[4831]: E1204 10:17:24.924904 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 10:17:25.424890666 +0000 UTC m=+142.374065980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:24 crc kubenswrapper[4831]: W1204 10:17:24.931439 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93463415_c819_44b0_9ee7_0de0698eb6a6.slice/crio-3c8c63a50ccd40eba1f8caa904ca1dda2c0e26776ba8110a64371fc44a541c0c WatchSource:0}: Error finding container 3c8c63a50ccd40eba1f8caa904ca1dda2c0e26776ba8110a64371fc44a541c0c: Status 404 returned error can't find the container with id 3c8c63a50ccd40eba1f8caa904ca1dda2c0e26776ba8110a64371fc44a541c0c Dec 04 10:17:24 crc kubenswrapper[4831]: I1204 10:17:24.971119 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9sxtr"] Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.029628 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.032235 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 10:17:25.532213546 +0000 UTC m=+142.481388860 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.135571 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.136818 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:25.636791746 +0000 UTC m=+142.585967060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.238170 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.238571 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:25.738555784 +0000 UTC m=+142.687731098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.338746 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.339174 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:25.839158499 +0000 UTC m=+142.788333813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.339380 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.339748 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:25.839739686 +0000 UTC m=+142.788915000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.359868 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k6xp2"] Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.383226 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6xp2"] Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.383349 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.389975 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.441816 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.442148 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:25.942132763 +0000 UTC m=+142.891308077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.549653 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.550712 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc589\" (UniqueName: \"kubernetes.io/projected/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-kube-api-access-xc589\") pod \"certified-operators-k6xp2\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.550843 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-utilities\") pod \"certified-operators-k6xp2\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.550956 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-catalog-content\") pod \"certified-operators-k6xp2\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.551404 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:26.051393198 +0000 UTC m=+143.000568512 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.562369 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nbm8n"] Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.563481 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.574579 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.580359 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nbm8n"] Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.655552 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.655753 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:26.155733011 +0000 UTC m=+143.104908336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.655958 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-catalog-content\") pod \"certified-operators-k6xp2\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.656009 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-catalog-content\") pod \"community-operators-nbm8n\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.656050 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mlp2\" (UniqueName: \"kubernetes.io/projected/16d53c94-9ca3-4b5a-b33d-829f44de5367-kube-api-access-7mlp2\") pod \"community-operators-nbm8n\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.656072 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.656124 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc589\" (UniqueName: \"kubernetes.io/projected/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-kube-api-access-xc589\") pod \"certified-operators-k6xp2\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.656159 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-utilities\") pod \"certified-operators-k6xp2\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.656176 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-utilities\") pod \"community-operators-nbm8n\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.656440 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:26.156431332 +0000 UTC m=+143.105606656 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.656450 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-catalog-content\") pod \"certified-operators-k6xp2\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.656724 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-utilities\") pod \"certified-operators-k6xp2\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.657214 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" event={"ID":"7da1b053-67d9-4a4c-842e-cafb5dce5017","Type":"ContainerStarted","Data":"72b4e07b0a07c41542a9f73ecb42f31e09452c8279d73464190c1a95774fa7fb"} Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.676876 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" event={"ID":"a8c8337c-874e-4c1d-9656-aeefef264a92","Type":"ContainerStarted","Data":"cd5c98b0dd19d02a0a565e363e2255a517551e5e50741a6c19faeb999b4a14c0"} Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.686042 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" event={"ID":"4ab7a865-d267-4ad7-88bd-e3b9e1f2510c","Type":"ContainerStarted","Data":"df82c21ca79234b90dea4e261e7d2ab92fcbb1b268a79d6be585dd3ab3c92ac7"} Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.697212 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4bj64" podStartSLOduration=121.697196659 podStartE2EDuration="2m1.697196659s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:25.696086217 +0000 UTC m=+142.645261521" watchObservedRunningTime="2025-12-04 10:17:25.697196659 +0000 UTC m=+142.646371973" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.698976 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq" event={"ID":"effa094e-e36b-4039-8523-dc11cffa2894","Type":"ContainerStarted","Data":"ede3dc0a21891f145d869bbbef1635845c878954111426ba460542f1d2976a78"} Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.711653 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc589\" (UniqueName: \"kubernetes.io/projected/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-kube-api-access-xc589\") pod \"certified-operators-k6xp2\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.718892 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" event={"ID":"61498aaa-6ad9-4572-954f-40d841edd6d8","Type":"ContainerStarted","Data":"e453ed83fd8acdfdf405e3881dd31ee1721ddaea41b78ce132a64111eeec66bf"} Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.740835 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" event={"ID":"73f9aaec-7f63-4909-9ffe-6b073e0225d9","Type":"ContainerStarted","Data":"26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb"} Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.741775 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.749574 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.750819 4831 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qp6r2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.750943 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" podUID="73f9aaec-7f63-4909-9ffe-6b073e0225d9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.753223 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cdsf2"] Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.754055 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.758931 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.759056 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-utilities\") pod \"community-operators-nbm8n\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.759140 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-catalog-content\") pod \"community-operators-nbm8n\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.759209 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mlp2\" (UniqueName: \"kubernetes.io/projected/16d53c94-9ca3-4b5a-b33d-829f44de5367-kube-api-access-7mlp2\") pod \"community-operators-nbm8n\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.760932 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 10:17:26.260912159 +0000 UTC m=+143.210087483 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.761205 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-catalog-content\") pod \"community-operators-nbm8n\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.761394 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-utilities\") pod \"community-operators-nbm8n\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.771975 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9q8xt" event={"ID":"51375b90-5931-416b-8a05-5a76d3e2852f","Type":"ContainerStarted","Data":"c11f2c66ed49677dcead4e60082ccd9c2d2360874e1461ebe8b1d98654d8dbf5"} Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.784224 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtt7n" podStartSLOduration=121.784201522 podStartE2EDuration="2m1.784201522s" 
podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:25.758276613 +0000 UTC m=+142.707451937" watchObservedRunningTime="2025-12-04 10:17:25.784201522 +0000 UTC m=+142.733376836" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.786420 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cdsf2"] Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.831864 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mlp2\" (UniqueName: \"kubernetes.io/projected/16d53c94-9ca3-4b5a-b33d-829f44de5367-kube-api-access-7mlp2\") pod \"community-operators-nbm8n\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.858885 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" event={"ID":"20c7a900-7157-4468-bd68-e74c2d382e85","Type":"ContainerStarted","Data":"b9e0fb5cb845aece4619db96800b8962f3f7bab67de53e625d5abdd7fce759b8"} Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.860197 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-utilities\") pod \"certified-operators-cdsf2\" (UID: \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.860267 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb4fz\" (UniqueName: \"kubernetes.io/projected/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-kube-api-access-tb4fz\") pod \"certified-operators-cdsf2\" (UID: 
\"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.860356 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.860401 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-catalog-content\") pod \"certified-operators-cdsf2\" (UID: \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.862048 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:26.362035229 +0000 UTC m=+143.311210543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.904143 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" podStartSLOduration=121.904122085 podStartE2EDuration="2m1.904122085s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:25.89495111 +0000 UTC m=+142.844126424" watchObservedRunningTime="2025-12-04 10:17:25.904122085 +0000 UTC m=+142.853297399" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.942348 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9q8xt" podStartSLOduration=7.942327658 podStartE2EDuration="7.942327658s" podCreationTimestamp="2025-12-04 10:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:25.937811048 +0000 UTC m=+142.886986362" watchObservedRunningTime="2025-12-04 10:17:25.942327658 +0000 UTC m=+142.891502972" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.945014 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.947365 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" event={"ID":"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3","Type":"ContainerStarted","Data":"bb21fdf73e12ca17a408e36b8b2948b4de195f855aecec64550b9723bb582263"} Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.947838 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.985414 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2nwlb"] Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.986448 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.986869 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-catalog-content\") pod \"certified-operators-cdsf2\" (UID: \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.986941 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-utilities\") pod \"certified-operators-cdsf2\" (UID: \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:17:25 crc kubenswrapper[4831]: 
I1204 10:17:25.986972 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb4fz\" (UniqueName: \"kubernetes.io/projected/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-kube-api-access-tb4fz\") pod \"certified-operators-cdsf2\" (UID: \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:17:25 crc kubenswrapper[4831]: E1204 10:17:25.987100 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:26.487079121 +0000 UTC m=+143.436254435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.987783 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-utilities\") pod \"certified-operators-cdsf2\" (UID: \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:17:25 crc kubenswrapper[4831]: I1204 10:17:25.987809 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-catalog-content\") pod \"certified-operators-cdsf2\" (UID: \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:17:26 crc 
kubenswrapper[4831]: I1204 10:17:26.015362 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.015945 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" event={"ID":"93463415-c819-44b0-9ee7-0de0698eb6a6","Type":"ContainerStarted","Data":"3c8c63a50ccd40eba1f8caa904ca1dda2c0e26776ba8110a64371fc44a541c0c"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.027350 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2nwlb"] Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.030937 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" podStartSLOduration=123.030920107 podStartE2EDuration="2m3.030920107s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:26.028721753 +0000 UTC m=+142.977897077" watchObservedRunningTime="2025-12-04 10:17:26.030920107 +0000 UTC m=+142.980095421" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.046478 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" event={"ID":"ad2c81a8-b9ac-4332-a6ae-a27e95e90537","Type":"ContainerStarted","Data":"0db584441d291204f83a39cd350aa9e565e3a3c398693f4ca9690252aad573c2"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.050618 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb4fz\" (UniqueName: \"kubernetes.io/projected/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-kube-api-access-tb4fz\") pod \"certified-operators-cdsf2\" (UID: \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " 
pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.060281 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" podStartSLOduration=123.060262754 podStartE2EDuration="2m3.060262754s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:26.058205755 +0000 UTC m=+143.007381069" watchObservedRunningTime="2025-12-04 10:17:26.060262754 +0000 UTC m=+143.009438068" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.082519 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" event={"ID":"db6feab7-169d-4a82-a204-f9a1ff56f85e","Type":"ContainerStarted","Data":"945374146b7d5da2ce006101814fb803f6cad3d4484705adf75e966a1dc7fda0"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.091010 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxrm\" (UniqueName: \"kubernetes.io/projected/d493a63f-9d1a-4995-8698-c3b75c46d69f-kube-api-access-9sxrm\") pod \"community-operators-2nwlb\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.091080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-utilities\") pod \"community-operators-2nwlb\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.091290 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.091323 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-catalog-content\") pod \"community-operators-2nwlb\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:17:26 crc kubenswrapper[4831]: E1204 10:17:26.095640 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:26.595627055 +0000 UTC m=+143.544802369 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.097459 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" event={"ID":"db1720b8-e97a-4399-8c33-a0dfe81a4621","Type":"ContainerStarted","Data":"ef1828b1533d0f4b978de573d72818a22544304a52df21c7650e78d075ab0436"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.115568 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-wjdnp" podStartSLOduration=123.115547171 podStartE2EDuration="2m3.115547171s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:26.113277355 +0000 UTC m=+143.062452669" watchObservedRunningTime="2025-12-04 10:17:26.115547171 +0000 UTC m=+143.064722485" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.139828 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2pnnb" podStartSLOduration=122.139809291 podStartE2EDuration="2m2.139809291s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:26.139340508 +0000 UTC m=+143.088515822" watchObservedRunningTime="2025-12-04 10:17:26.139809291 +0000 UTC 
m=+143.088984605" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.151458 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" event={"ID":"a57c45a9-d720-4144-86c6-dc4c3cfbd9bd","Type":"ContainerStarted","Data":"244d84e766aacfa5937e7a3c57c843812db4325e4dced122283d52612c7449a1"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.169500 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx" event={"ID":"68cdd202-055b-41c7-ac5f-a13b918c44fc","Type":"ContainerStarted","Data":"37750ca4acab53b909e7db354cff3610e593008ff51683eaefaa7a6faf62c81d"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.169788 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mlcpb" podStartSLOduration=122.169770367 podStartE2EDuration="2m2.169770367s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:26.167005147 +0000 UTC m=+143.116180461" watchObservedRunningTime="2025-12-04 10:17:26.169770367 +0000 UTC m=+143.118945681" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.173935 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.191729 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.191948 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-catalog-content\") pod \"community-operators-2nwlb\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.192012 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sxrm\" (UniqueName: \"kubernetes.io/projected/d493a63f-9d1a-4995-8698-c3b75c46d69f-kube-api-access-9sxrm\") pod \"community-operators-2nwlb\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.192046 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-utilities\") pod \"community-operators-2nwlb\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.192524 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-utilities\") pod \"community-operators-2nwlb\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " 
pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:17:26 crc kubenswrapper[4831]: E1204 10:17:26.192633 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:26.692601216 +0000 UTC m=+143.641776530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.194067 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-catalog-content\") pod \"community-operators-2nwlb\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.205806 4831 generic.go:334] "Generic (PLEG): container finished" podID="e57fed72-8221-44fb-8108-44e9a0dcc51a" containerID="d8a4dee1d40884e41accc8bcee288847eb7caa0d57f6ac780ae50c71f9eb2c48" exitCode=0 Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.206013 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" event={"ID":"e57fed72-8221-44fb-8108-44e9a0dcc51a","Type":"ContainerDied","Data":"d8a4dee1d40884e41accc8bcee288847eb7caa0d57f6ac780ae50c71f9eb2c48"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.222888 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.224099 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.276741 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sxrm\" (UniqueName: \"kubernetes.io/projected/d493a63f-9d1a-4995-8698-c3b75c46d69f-kube-api-access-9sxrm\") pod \"community-operators-2nwlb\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.288939 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" event={"ID":"ab874db5-da93-478d-927b-21104115cf12","Type":"ContainerStarted","Data":"4e7e694c1465442e341486186e1ea2e9a4884e7687227e6c89ea02e0651e8424"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.290301 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx" podStartSLOduration=122.290284217 podStartE2EDuration="2m2.290284217s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:26.223965792 +0000 UTC m=+143.173141106" watchObservedRunningTime="2025-12-04 10:17:26.290284217 +0000 UTC m=+143.239459531" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.291558 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" event={"ID":"c589b350-6e72-40be-9f47-b26e80dc5ba6","Type":"ContainerStarted","Data":"5219ba9463432a7a13a1824fa14495dfb3b24541c087a9c148520fc620762370"} Dec 04 
10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.292963 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:26 crc kubenswrapper[4831]: E1204 10:17:26.293776 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:26.793761407 +0000 UTC m=+143.742936731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.309953 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nmqqm" event={"ID":"4783614b-88f9-4207-b6ef-f73824ce9334","Type":"ContainerStarted","Data":"ed71f4054b2aa4db66065811211299e71b9414ce42880c6488b10d0ec195aaa8"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.326867 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" event={"ID":"87734f51-2b97-4726-8eba-c95450c2686b","Type":"ContainerStarted","Data":"6dbe4c103f505d271dea007c902667a50aa9780d754a262d1b190eea71425b50"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.327921 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.335341 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" event={"ID":"b40be81d-febb-4200-8271-9b562f0ace35","Type":"ContainerStarted","Data":"235131eeec75b3bbcf45d48aa86aec2ff91a6f3a410d210dfdfe5b9ab3dc69ed"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.337881 4831 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-frgsp container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.337919 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" podUID="87734f51-2b97-4726-8eba-c95450c2686b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.373812 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.393740 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:26 crc kubenswrapper[4831]: E1204 10:17:26.393978 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:26.893946721 +0000 UTC m=+143.843122035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.394174 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.393841 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lzxr6" 
event={"ID":"04bb6d4d-643d-4015-9579-c522cfc43c2a","Type":"ContainerStarted","Data":"fe221d0181e5bbaa71e81aa3c6164a215ac51033f9d3f2733fecbe5df989fc29"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.395205 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:26 crc kubenswrapper[4831]: E1204 10:17:26.396471 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:26.896462573 +0000 UTC m=+143.845637877 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.453922 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" event={"ID":"7eb2c02c-884b-45f4-8387-e2e71df329a9","Type":"ContainerStarted","Data":"06b8a6d166f1a3385f09b4f05dc26db8b572efc7c58a994d685dd9ae448b67a9"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.470471 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" event={"ID":"9a3a8166-a7c6-4fb0-a2a1-83003acf436f","Type":"ContainerStarted","Data":"b49e42f0e887efc4c5cc1a78843ebf1be13f8894731fc86f1a1f8bfda830697c"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.470510 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" event={"ID":"9a3a8166-a7c6-4fb0-a2a1-83003acf436f","Type":"ContainerStarted","Data":"2d47354b9bb1a43036cd97fe5b6feca6f644d997bcd27387eb2b089a97d58f36"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.472122 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-7czl2" podStartSLOduration=122.472098818 podStartE2EDuration="2m2.472098818s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:26.468189245 +0000 UTC m=+143.417364569" watchObservedRunningTime="2025-12-04 10:17:26.472098818 +0000 UTC m=+143.421274132" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.477858 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" event={"ID":"bf8f0aa6-641c-4258-bd46-541bf71d40b1","Type":"ContainerStarted","Data":"605bb32b283094516b86be542972e2caa9cd90e1c12f44f5aeaa0327cf194ee6"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.478459 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.495233 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:26 crc kubenswrapper[4831]: E1204 10:17:26.496544 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:26.996529403 +0000 UTC m=+143.945704717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.517727 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lzxr6" podStartSLOduration=8.517714335 podStartE2EDuration="8.517714335s" podCreationTimestamp="2025-12-04 10:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:26.516880701 +0000 UTC m=+143.466056015" watchObservedRunningTime="2025-12-04 10:17:26.517714335 +0000 UTC m=+143.466889649" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.538031 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" event={"ID":"a592d34f-e6b6-4ec2-8c63-2249f70cd3f4","Type":"ContainerStarted","Data":"1b3d1a7cde53db98618fc822d15d51e58a609e03373759c3d31f1021ae904036"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.550564 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-nmqqm" podStartSLOduration=122.550542103 podStartE2EDuration="2m2.550542103s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 
10:17:26.550366468 +0000 UTC m=+143.499541782" watchObservedRunningTime="2025-12-04 10:17:26.550542103 +0000 UTC m=+143.499717417" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.599867 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:26 crc kubenswrapper[4831]: E1204 10:17:26.601424 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:27.101411402 +0000 UTC m=+144.050586716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.640330 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d85d4" event={"ID":"f5d23b38-2bda-4faf-a3da-8565336a48a2","Type":"ContainerStarted","Data":"ed87ff39f47affc51bab56bcb35975797a17f67f15553d428e2246227805aaee"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.652217 4831 generic.go:334] "Generic (PLEG): container finished" podID="e3ea5ca7-6c75-482a-9245-518056647743" containerID="6fff4ac3199b1f6da4def1d5bd52a87be7b04e58158199733fa6bc85fdf2ef68" exitCode=0 Dec 04 10:17:26 crc 
kubenswrapper[4831]: I1204 10:17:26.652293 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" event={"ID":"e3ea5ca7-6c75-482a-9245-518056647743","Type":"ContainerDied","Data":"6fff4ac3199b1f6da4def1d5bd52a87be7b04e58158199733fa6bc85fdf2ef68"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.655117 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" event={"ID":"25ecb4dd-adbb-48db-8563-78c6f9faff4f","Type":"ContainerStarted","Data":"298750aec944e378c436fbf86f956ceb7685880e868af964154741557ef4620d"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.678152 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" event={"ID":"3acf1264-0fb4-46e4-a876-8e7677b39304","Type":"ContainerStarted","Data":"171540a4df789416212f34aecc97da1c13cbdc8441a31860ddd98b933bc96a0f"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.678447 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" event={"ID":"3acf1264-0fb4-46e4-a876-8e7677b39304","Type":"ContainerStarted","Data":"333900fa20e9db8476e8a831a42e5fa20ed7bfca2b3d791b4f8be2d5a0e41e4d"} Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.679863 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" podStartSLOduration=122.679839547 podStartE2EDuration="2m2.679839547s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:26.619980689 +0000 UTC m=+143.569156003" watchObservedRunningTime="2025-12-04 10:17:26.679839547 +0000 UTC m=+143.629014851" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.702364 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:26 crc kubenswrapper[4831]: E1204 10:17:26.756334 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:27.256304296 +0000 UTC m=+144.205479610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.804556 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.804878 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" event={"ID":"d686eb67-074c-4086-b33c-c632acc21a66","Type":"ContainerStarted","Data":"fefee6f79ffa63fa7fa063f900295e491f5c243faaddc9cb0e7041b56ec9e77e"} Dec 04 10:17:26 crc 
kubenswrapper[4831]: E1204 10:17:26.804976 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:27.304962161 +0000 UTC m=+144.254137475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.826231 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-2h8nq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.826280 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2h8nq" podUID="08171e38-fc77-4d5a-acc9-58dca0784830" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.916697 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" podStartSLOduration=122.916676597 podStartE2EDuration="2m2.916676597s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 
10:17:26.66780365 +0000 UTC m=+143.616978964" watchObservedRunningTime="2025-12-04 10:17:26.916676597 +0000 UTC m=+143.865851911" Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.917077 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:26 crc kubenswrapper[4831]: E1204 10:17:26.918239 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:27.418223682 +0000 UTC m=+144.367398996 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:26 crc kubenswrapper[4831]: I1204 10:17:26.992821 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6xp2"] Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.025407 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" 
Dec 04 10:17:27 crc kubenswrapper[4831]: E1204 10:17:27.025735 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:27.525723466 +0000 UTC m=+144.474898770 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.027074 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-d85d4" podStartSLOduration=9.027064285 podStartE2EDuration="9.027064285s" podCreationTimestamp="2025-12-04 10:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:27.000178798 +0000 UTC m=+143.949354122" watchObservedRunningTime="2025-12-04 10:17:27.027064285 +0000 UTC m=+143.976239599" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.029948 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nbm8n"] Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.053557 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" podStartSLOduration=124.05354033 podStartE2EDuration="2m4.05354033s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-04 10:17:27.045876738 +0000 UTC m=+143.995052062" watchObservedRunningTime="2025-12-04 10:17:27.05354033 +0000 UTC m=+144.002715644" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.093251 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l2cns" podStartSLOduration=126.093228986 podStartE2EDuration="2m6.093228986s" podCreationTimestamp="2025-12-04 10:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:27.090125026 +0000 UTC m=+144.039300340" watchObservedRunningTime="2025-12-04 10:17:27.093228986 +0000 UTC m=+144.042404300" Dec 04 10:17:27 crc kubenswrapper[4831]: W1204 10:17:27.118817 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16d53c94_9ca3_4b5a_b33d_829f44de5367.slice/crio-f29c3128e2572671a588f18c9de98595288b3b9ba61f830595e990fb06bf345c WatchSource:0}: Error finding container f29c3128e2572671a588f18c9de98595288b3b9ba61f830595e990fb06bf345c: Status 404 returned error can't find the container with id f29c3128e2572671a588f18c9de98595288b3b9ba61f830595e990fb06bf345c Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.127196 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:27 crc kubenswrapper[4831]: E1204 10:17:27.127633 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 10:17:27.627615889 +0000 UTC m=+144.576791193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.160418 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xjd68" podStartSLOduration=124.160402576 podStartE2EDuration="2m4.160402576s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:27.157124531 +0000 UTC m=+144.106299845" watchObservedRunningTime="2025-12-04 10:17:27.160402576 +0000 UTC m=+144.109577890" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.160548 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cdsf2"] Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.186746 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.190073 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pr7kc" podStartSLOduration=123.190059312 podStartE2EDuration="2m3.190059312s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 10:17:27.189155036 +0000 UTC m=+144.138330350" watchObservedRunningTime="2025-12-04 10:17:27.190059312 +0000 UTC m=+144.139234626" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.191017 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:27 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:27 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:27 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.191053 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.230310 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:27 crc kubenswrapper[4831]: E1204 10:17:27.230642 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:27.730630924 +0000 UTC m=+144.679806238 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.232918 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6fssm" podStartSLOduration=123.23290516 podStartE2EDuration="2m3.23290516s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:27.225172766 +0000 UTC m=+144.174348080" watchObservedRunningTime="2025-12-04 10:17:27.23290516 +0000 UTC m=+144.182080474" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.263099 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" podStartSLOduration=123.263085521 podStartE2EDuration="2m3.263085521s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:27.261741472 +0000 UTC m=+144.210916806" watchObservedRunningTime="2025-12-04 10:17:27.263085521 +0000 UTC m=+144.212260835" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.332277 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:27 crc kubenswrapper[4831]: E1204 10:17:27.332560 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:27.832546577 +0000 UTC m=+144.781721891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.356005 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" podStartSLOduration=123.355976464 podStartE2EDuration="2m3.355976464s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:27.350050843 +0000 UTC m=+144.299226157" watchObservedRunningTime="2025-12-04 10:17:27.355976464 +0000 UTC m=+144.305151798" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.370527 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2nwlb"] Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.435485 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:27 crc kubenswrapper[4831]: E1204 10:17:27.435944 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:27.935930763 +0000 UTC m=+144.885106077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.540133 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:27 crc kubenswrapper[4831]: E1204 10:17:27.540543 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:28.040525524 +0000 UTC m=+144.989700838 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.562455 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xgjhm"] Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.564193 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.569020 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.583881 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgjhm"] Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.641761 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:27 crc kubenswrapper[4831]: E1204 10:17:27.642338 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:28.142323593 +0000 UTC m=+145.091498907 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.743305 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.743503 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc9hn\" (UniqueName: \"kubernetes.io/projected/0164870d-aba7-43b6-b798-93ec48968837-kube-api-access-sc9hn\") pod \"redhat-marketplace-xgjhm\" (UID: \"0164870d-aba7-43b6-b798-93ec48968837\") " pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.743556 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-catalog-content\") pod \"redhat-marketplace-xgjhm\" (UID: \"0164870d-aba7-43b6-b798-93ec48968837\") " pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.743612 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-utilities\") pod \"redhat-marketplace-xgjhm\" (UID: 
\"0164870d-aba7-43b6-b798-93ec48968837\") " pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:17:27 crc kubenswrapper[4831]: E1204 10:17:27.743745 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:28.243728542 +0000 UTC m=+145.192903856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.843077 4831 generic.go:334] "Generic (PLEG): container finished" podID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" containerID="63f91e592ea006ce92a36cf5e1f99851ccdbd4e1979bf367ba56187011e52ec5" exitCode=0 Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.843138 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6xp2" event={"ID":"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997","Type":"ContainerDied","Data":"63f91e592ea006ce92a36cf5e1f99851ccdbd4e1979bf367ba56187011e52ec5"} Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.843161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6xp2" event={"ID":"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997","Type":"ContainerStarted","Data":"c11fdb5031e0611630e2779ccd2df35caf362f5c79525779de53c51647211fd1"} Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.845089 4831 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.845571 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc9hn\" (UniqueName: \"kubernetes.io/projected/0164870d-aba7-43b6-b798-93ec48968837-kube-api-access-sc9hn\") pod \"redhat-marketplace-xgjhm\" (UID: \"0164870d-aba7-43b6-b798-93ec48968837\") " pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.845618 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-catalog-content\") pod \"redhat-marketplace-xgjhm\" (UID: \"0164870d-aba7-43b6-b798-93ec48968837\") " pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.845671 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.845700 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-utilities\") pod \"redhat-marketplace-xgjhm\" (UID: \"0164870d-aba7-43b6-b798-93ec48968837\") " pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.846059 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-utilities\") pod \"redhat-marketplace-xgjhm\" (UID: 
\"0164870d-aba7-43b6-b798-93ec48968837\") " pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.846425 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-catalog-content\") pod \"redhat-marketplace-xgjhm\" (UID: \"0164870d-aba7-43b6-b798-93ec48968837\") " pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:17:27 crc kubenswrapper[4831]: E1204 10:17:27.848092 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:28.348074115 +0000 UTC m=+145.297249429 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.860268 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" event={"ID":"20c7a900-7157-4468-bd68-e74c2d382e85","Type":"ContainerStarted","Data":"941232591ae95513abe6dff7e4ed303d1f934c54c042d38dfc6556de060a7c29"} Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.860306 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" 
event={"ID":"20c7a900-7157-4468-bd68-e74c2d382e85","Type":"ContainerStarted","Data":"57440271068ea386a703aa0ae5b44100e7f8f26aa5592754738840ae5455ad5c"} Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.870622 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" event={"ID":"e3ea5ca7-6c75-482a-9245-518056647743","Type":"ContainerStarted","Data":"e51c348b7183a7a86df4a784cf1c3090f376c94c84a83fde917445eaa0ea0a83"} Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.874789 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" event={"ID":"a8c8337c-874e-4c1d-9656-aeefef264a92","Type":"ContainerStarted","Data":"a93d973e3a21626821c6a59bfcbfbfd210cfbe31ca88e42704e529358c7bf4e4"} Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.874824 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" event={"ID":"a8c8337c-874e-4c1d-9656-aeefef264a92","Type":"ContainerStarted","Data":"76cd9cd7751643f0ec273e065e826d87b129f3208e11daf3fe959c2dca8f0261"} Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.877510 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-56tfw" event={"ID":"37cadb5f-eedf-40ab-b4db-44d9935e26eb","Type":"ContainerStarted","Data":"978c20121088d8a1f4f5133a6f43245e186112ef2215db7536b53ca4e9e77696"} Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.879581 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" event={"ID":"4ab7a865-d267-4ad7-88bd-e3b9e1f2510c","Type":"ContainerStarted","Data":"90abea2bb74e08c3ad552dbbafb3b34be3726c72d947430291938704f869bb7b"} Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.913171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" 
event={"ID":"db1720b8-e97a-4399-8c33-a0dfe81a4621","Type":"ContainerStarted","Data":"56b72b206ad3a879b0469bfc04100d6b8e050071e2fcba2f0cc4a8e8f09563b0"} Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.913211 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" event={"ID":"db1720b8-e97a-4399-8c33-a0dfe81a4621","Type":"ContainerStarted","Data":"5258d015c883cd59f28b8d8baed0de650c84d08776b8d1a0fbceb3198559b3ed"} Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.913854 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.933392 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc9hn\" (UniqueName: \"kubernetes.io/projected/0164870d-aba7-43b6-b798-93ec48968837-kube-api-access-sc9hn\") pod \"redhat-marketplace-xgjhm\" (UID: \"0164870d-aba7-43b6-b798-93ec48968837\") " pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.949206 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:27 crc kubenswrapper[4831]: E1204 10:17:27.950169 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:28.450154333 +0000 UTC m=+145.399329647 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:27 crc kubenswrapper[4831]: I1204 10:17:27.950176 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mgrdx" event={"ID":"68cdd202-055b-41c7-ac5f-a13b918c44fc","Type":"ContainerStarted","Data":"1e354b9beb36af95427744abd8a9741f26d21ffdc829c5780aa7f0d57013b3f9"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.007132 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nwlb" event={"ID":"d493a63f-9d1a-4995-8698-c3b75c46d69f","Type":"ContainerStarted","Data":"d503e5db61636659044d128fb4d62d8422ad3b5040e1d73af8e853aa5ffda2fc"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.007446 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nwlb" event={"ID":"d493a63f-9d1a-4995-8698-c3b75c46d69f","Type":"ContainerStarted","Data":"98ed8bafb6fd6deef417dd7fea6fac8fd18d995cf61be09e6819f65959725801"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.028186 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" event={"ID":"c589b350-6e72-40be-9f47-b26e80dc5ba6","Type":"ContainerStarted","Data":"a832477279c1a22c19d28ed7d96802a54871787969bbd316e4313b351630d862"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.028238 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" 
event={"ID":"c589b350-6e72-40be-9f47-b26e80dc5ba6","Type":"ContainerStarted","Data":"7e75a3f8f037648bbd106d905e12bf1e6dba991ae72aa3bdd8352410543b9193"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.033586 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" event={"ID":"87734f51-2b97-4726-8eba-c95450c2686b","Type":"ContainerStarted","Data":"6a78c2d146554e1c4d2ddaefc2856f0376afcc682bcd1471df4e10a23dce226f"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.035703 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-twgk8" event={"ID":"7eb2c02c-884b-45f4-8387-e2e71df329a9","Type":"ContainerStarted","Data":"13b7c2adeaf6e238d8ab7a43aa5f22f5c284e1bb1e04cbed48c7d3350a1f3073"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.038189 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bp6kt" podStartSLOduration=124.038172785 podStartE2EDuration="2m4.038172785s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:27.997041077 +0000 UTC m=+144.946216391" watchObservedRunningTime="2025-12-04 10:17:28.038172785 +0000 UTC m=+144.987348099" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.038493 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cgzpd"] Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.039698 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.040038 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" event={"ID":"e57fed72-8221-44fb-8108-44e9a0dcc51a","Type":"ContainerStarted","Data":"dad4c1c578f3284cdb3fd565f3796c973be930528cb9f76e7d09cf83b8d542d4"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.050188 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:28 crc kubenswrapper[4831]: E1204 10:17:28.051189 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:28.551178161 +0000 UTC m=+145.500353475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.067537 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lzxr6" event={"ID":"04bb6d4d-643d-4015-9579-c522cfc43c2a","Type":"ContainerStarted","Data":"aa22555a175fefb601e6fc8458e7e1caa9d44b889b6671daee7647cdb1d10a3c"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.085584 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgzpd"] Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.106801 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9sxtr" podStartSLOduration=124.106787477 podStartE2EDuration="2m4.106787477s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:28.093944066 +0000 UTC m=+145.043119390" watchObservedRunningTime="2025-12-04 10:17:28.106787477 +0000 UTC m=+145.055962791" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.116672 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq" event={"ID":"effa094e-e36b-4039-8523-dc11cffa2894","Type":"ContainerStarted","Data":"38b71dbeef8c74c8402f9be257654f7de1092b62f1df2e5f8b0b964450d5f9bb"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.128998 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-frgsp" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.157703 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zln28" event={"ID":"ab874db5-da93-478d-927b-21104115cf12","Type":"ContainerStarted","Data":"11c3520b2ff2cf9e603fd1cc15a4920017829bfa2ed4219dd967ea066ed06936"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.161360 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.161539 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbtj9\" (UniqueName: \"kubernetes.io/projected/c1d2b5a4-7636-444a-88c3-38bd88a35f99-kube-api-access-cbtj9\") pod \"redhat-marketplace-cgzpd\" (UID: \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.161819 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-catalog-content\") pod \"redhat-marketplace-cgzpd\" (UID: \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.161895 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-utilities\") pod \"redhat-marketplace-cgzpd\" (UID: 
\"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:17:28 crc kubenswrapper[4831]: E1204 10:17:28.162376 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:28.662360312 +0000 UTC m=+145.611535616 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.162864 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-2rbhc" podStartSLOduration=124.162853226 podStartE2EDuration="2m4.162853226s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:28.16020847 +0000 UTC m=+145.109383784" watchObservedRunningTime="2025-12-04 10:17:28.162853226 +0000 UTC m=+145.112028540" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.177930 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" event={"ID":"93463415-c819-44b0-9ee7-0de0698eb6a6","Type":"ContainerStarted","Data":"c90d8cfb7440730d1485d43bae49e534d471fe62033843e093914a44c8c4d6e0"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.199566 4831 patch_prober.go:28] interesting 
pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:28 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:28 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:28 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.199623 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.200151 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.203614 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbm8n" event={"ID":"16d53c94-9ca3-4b5a-b33d-829f44de5367","Type":"ContainerStarted","Data":"1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.203673 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbm8n" event={"ID":"16d53c94-9ca3-4b5a-b33d-829f44de5367","Type":"ContainerStarted","Data":"f29c3128e2572671a588f18c9de98595288b3b9ba61f830595e990fb06bf345c"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.217410 4831 generic.go:334] "Generic (PLEG): container finished" podID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" containerID="39b244c2954a101a943244d0a98bb0600199e5dfb21b16b42d2856e5a42dd095" exitCode=0 Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.218416 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdsf2" 
event={"ID":"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1","Type":"ContainerDied","Data":"39b244c2954a101a943244d0a98bb0600199e5dfb21b16b42d2856e5a42dd095"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.218450 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdsf2" event={"ID":"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1","Type":"ContainerStarted","Data":"8ac161e7372733eb638a32be109e3ec5d29d6d4fcd912058b1f7d75d86fef5fc"} Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.219796 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-2h8nq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.219850 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2h8nq" podUID="08171e38-fc77-4d5a-acc9-58dca0784830" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.220157 4831 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qp6r2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.220197 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" podUID="73f9aaec-7f63-4909-9ffe-6b073e0225d9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.221715 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.227732 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" podStartSLOduration=124.227718859 podStartE2EDuration="2m4.227718859s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:28.226622408 +0000 UTC m=+145.175797742" watchObservedRunningTime="2025-12-04 10:17:28.227718859 +0000 UTC m=+145.176894173" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.245872 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qkzn2" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.263237 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-catalog-content\") pod \"redhat-marketplace-cgzpd\" (UID: \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.263417 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-utilities\") pod \"redhat-marketplace-cgzpd\" (UID: \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.263634 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbtj9\" (UniqueName: 
\"kubernetes.io/projected/c1d2b5a4-7636-444a-88c3-38bd88a35f99-kube-api-access-cbtj9\") pod \"redhat-marketplace-cgzpd\" (UID: \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.263739 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.264435 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-utilities\") pod \"redhat-marketplace-cgzpd\" (UID: \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.268959 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-catalog-content\") pod \"redhat-marketplace-cgzpd\" (UID: \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:17:28 crc kubenswrapper[4831]: E1204 10:17:28.273923 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:28.773909443 +0000 UTC m=+145.723084757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.355823 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4vs7" podStartSLOduration=124.355805539 podStartE2EDuration="2m4.355805539s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:28.354344796 +0000 UTC m=+145.303520110" watchObservedRunningTime="2025-12-04 10:17:28.355805539 +0000 UTC m=+145.304980853" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.364412 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbtj9\" (UniqueName: \"kubernetes.io/projected/c1d2b5a4-7636-444a-88c3-38bd88a35f99-kube-api-access-cbtj9\") pod \"redhat-marketplace-cgzpd\" (UID: \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.365110 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:28 crc kubenswrapper[4831]: E1204 10:17:28.365291 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:28.865270232 +0000 UTC m=+145.814445556 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.365551 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:28 crc kubenswrapper[4831]: E1204 10:17:28.365838 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:28.865828468 +0000 UTC m=+145.815003782 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.380918 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.424519 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.469481 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:28 crc kubenswrapper[4831]: E1204 10:17:28.470024 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:28.970003936 +0000 UTC m=+145.919179250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.571245 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:28 crc kubenswrapper[4831]: E1204 10:17:28.571577 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:29.07156252 +0000 UTC m=+146.020737834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.573298 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tf7kq" podStartSLOduration=124.573285309 podStartE2EDuration="2m4.573285309s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:28.538463544 +0000 UTC m=+145.487638858" watchObservedRunningTime="2025-12-04 10:17:28.573285309 +0000 UTC m=+145.522460613" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.575586 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bbqht"] Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.581727 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.593290 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.601833 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bbqht"] Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.603160 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" podStartSLOduration=124.60310586 podStartE2EDuration="2m4.60310586s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:28.593535504 +0000 UTC m=+145.542710818" watchObservedRunningTime="2025-12-04 10:17:28.60310586 +0000 UTC m=+145.552281304" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.684699 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:28 crc kubenswrapper[4831]: E1204 10:17:28.685204 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:29.185188981 +0000 UTC m=+146.134364285 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.776999 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgjhm"] Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.788502 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-utilities\") pod \"redhat-operators-bbqht\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.788560 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.788584 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-catalog-content\") pod \"redhat-operators-bbqht\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.788610 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndrmg\" (UniqueName: \"kubernetes.io/projected/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-kube-api-access-ndrmg\") pod \"redhat-operators-bbqht\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:17:28 crc kubenswrapper[4831]: E1204 10:17:28.788915 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:29.288903416 +0000 UTC m=+146.238078730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.892982 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.893480 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-utilities\") pod \"redhat-operators-bbqht\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.893552 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-catalog-content\") pod \"redhat-operators-bbqht\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.893592 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndrmg\" (UniqueName: \"kubernetes.io/projected/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-kube-api-access-ndrmg\") pod \"redhat-operators-bbqht\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:17:28 crc kubenswrapper[4831]: E1204 10:17:28.894386 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:29.394371261 +0000 UTC m=+146.343546575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.894758 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-utilities\") pod \"redhat-operators-bbqht\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.894956 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-catalog-content\") pod \"redhat-operators-bbqht\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.934782 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndrmg\" (UniqueName: \"kubernetes.io/projected/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-kube-api-access-ndrmg\") pod \"redhat-operators-bbqht\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.935928 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.960816 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgzpd"] Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.964435 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c7k4c"] Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.965339 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.984049 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c7k4c"] Dec 04 10:17:28 crc kubenswrapper[4831]: I1204 10:17:28.994534 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:28 crc kubenswrapper[4831]: E1204 10:17:28.994951 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:29.494937585 +0000 UTC m=+146.444112899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.095777 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:29 crc kubenswrapper[4831]: E1204 10:17:29.095931 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:29.595906731 +0000 UTC m=+146.545082045 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.096642 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-catalog-content\") pod \"redhat-operators-c7k4c\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.096833 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2ptk\" (UniqueName: \"kubernetes.io/projected/470670f0-d9fa-4aca-8086-a63b711191d1-kube-api-access-g2ptk\") pod \"redhat-operators-c7k4c\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.096957 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-utilities\") pod \"redhat-operators-c7k4c\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.097158 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:29 crc kubenswrapper[4831]: E1204 10:17:29.097861 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:29.597845437 +0000 UTC m=+146.547020761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.188525 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:29 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:29 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:29 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.188567 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.198261 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:29 crc kubenswrapper[4831]: E1204 10:17:29.198373 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:29.69835078 +0000 UTC m=+146.647526094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.198553 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.198607 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-catalog-content\") pod \"redhat-operators-c7k4c\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.198709 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g2ptk\" (UniqueName: \"kubernetes.io/projected/470670f0-d9fa-4aca-8086-a63b711191d1-kube-api-access-g2ptk\") pod \"redhat-operators-c7k4c\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.198767 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-utilities\") pod \"redhat-operators-c7k4c\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.199229 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-utilities\") pod \"redhat-operators-c7k4c\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:17:29 crc kubenswrapper[4831]: E1204 10:17:29.199503 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:29.699489683 +0000 UTC m=+146.648664997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.199915 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-catalog-content\") pod \"redhat-operators-c7k4c\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.220042 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2ptk\" (UniqueName: \"kubernetes.io/projected/470670f0-d9fa-4aca-8086-a63b711191d1-kube-api-access-g2ptk\") pod \"redhat-operators-c7k4c\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.232979 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgjhm" event={"ID":"0164870d-aba7-43b6-b798-93ec48968837","Type":"ContainerStarted","Data":"3600c2a4350f451bd62d5c3fce4de9543305f2144642dd5effa9f8d1d91128ee"} Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.236082 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" event={"ID":"e3ea5ca7-6c75-482a-9245-518056647743","Type":"ContainerStarted","Data":"d022f42803746f1f0b024e9f0eabcbcde63965533c4ec5d1c82358cc1eecde1e"} Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.248966 4831 generic.go:334] "Generic (PLEG): container 
finished" podID="d493a63f-9d1a-4995-8698-c3b75c46d69f" containerID="d503e5db61636659044d128fb4d62d8422ad3b5040e1d73af8e853aa5ffda2fc" exitCode=0 Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.249095 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nwlb" event={"ID":"d493a63f-9d1a-4995-8698-c3b75c46d69f","Type":"ContainerDied","Data":"d503e5db61636659044d128fb4d62d8422ad3b5040e1d73af8e853aa5ffda2fc"} Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.252198 4831 generic.go:334] "Generic (PLEG): container finished" podID="16d53c94-9ca3-4b5a-b33d-829f44de5367" containerID="1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2" exitCode=0 Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.252239 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbm8n" event={"ID":"16d53c94-9ca3-4b5a-b33d-829f44de5367","Type":"ContainerDied","Data":"1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2"} Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.259108 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" podStartSLOduration=126.259079734 podStartE2EDuration="2m6.259079734s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:29.25515032 +0000 UTC m=+146.204325644" watchObservedRunningTime="2025-12-04 10:17:29.259079734 +0000 UTC m=+146.208255048" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.262211 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgzpd" event={"ID":"c1d2b5a4-7636-444a-88c3-38bd88a35f99","Type":"ContainerStarted","Data":"5a8b3302b4c33265a169cf3814d4e9b6419bee391df98534c2da1cc22dae181b"} Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 
10:17:29.268334 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.290376 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bbqht"] Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.304593 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.306218 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:29 crc kubenswrapper[4831]: E1204 10:17:29.310834 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 10:17:29.810794227 +0000 UTC m=+146.759969541 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.409093 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:29 crc kubenswrapper[4831]: E1204 10:17:29.414913 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 10:17:29.914898874 +0000 UTC m=+146.864074178 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7b6hb" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.474912 4831 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.476869 4831 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-04T10:17:29.475142034Z","Handler":null,"Name":""} Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.478755 4831 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.478808 4831 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.510401 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.518107 4831 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.553950 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c7k4c"] Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.611678 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.629903 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.629965 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.688912 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7b6hb\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:29 crc kubenswrapper[4831]: I1204 10:17:29.864882 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.068099 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7b6hb"] Dec 04 10:17:30 crc kubenswrapper[4831]: W1204 10:17:30.128334 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca0af9c8_f2fa_45f8_a428_9b061441bddf.slice/crio-342cc0e1e622c6b03b82f2a37c15b03e4981eb14c6c54ed2150bebe14ad86c37 WatchSource:0}: Error finding container 342cc0e1e622c6b03b82f2a37c15b03e4981eb14c6c54ed2150bebe14ad86c37: Status 404 returned error can't find the container with id 342cc0e1e622c6b03b82f2a37c15b03e4981eb14c6c54ed2150bebe14ad86c37 Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.188612 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:30 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:30 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:30 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.188704 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.267648 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7k4c" event={"ID":"470670f0-d9fa-4aca-8086-a63b711191d1","Type":"ContainerStarted","Data":"4d1beaf5fdb87acf1375d4ec15b2e393aa26e577d1bf40ae0476755eb6374ec7"} Dec 04 10:17:30 crc 
kubenswrapper[4831]: I1204 10:17:30.267963 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7k4c" event={"ID":"470670f0-d9fa-4aca-8086-a63b711191d1","Type":"ContainerStarted","Data":"2c4e1513bfb1d8789eff855f374410bd10bbe99eb9dd3a9364990461d369f014"} Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.269265 4831 generic.go:334] "Generic (PLEG): container finished" podID="93463415-c819-44b0-9ee7-0de0698eb6a6" containerID="c90d8cfb7440730d1485d43bae49e534d471fe62033843e093914a44c8c4d6e0" exitCode=0 Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.269314 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" event={"ID":"93463415-c819-44b0-9ee7-0de0698eb6a6","Type":"ContainerDied","Data":"c90d8cfb7440730d1485d43bae49e534d471fe62033843e093914a44c8c4d6e0"} Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.270525 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" event={"ID":"ca0af9c8-f2fa-45f8-a428-9b061441bddf","Type":"ContainerStarted","Data":"342cc0e1e622c6b03b82f2a37c15b03e4981eb14c6c54ed2150bebe14ad86c37"} Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.272318 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqht" event={"ID":"069ef39a-dfc9-4a4c-acf4-758475a8a7b0","Type":"ContainerStarted","Data":"0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b"} Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.272358 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqht" event={"ID":"069ef39a-dfc9-4a4c-acf4-758475a8a7b0","Type":"ContainerStarted","Data":"7e7f82c3d9f35b88b7583ada3d87c557c2b15de840cfc783e45152c8b8e4c1fe"} Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.273502 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" containerID="bda75a0751b7174d619161e97d7751cd79e9e49952bf29f6262318c72c526482" exitCode=0 Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.273555 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgzpd" event={"ID":"c1d2b5a4-7636-444a-88c3-38bd88a35f99","Type":"ContainerDied","Data":"bda75a0751b7174d619161e97d7751cd79e9e49952bf29f6262318c72c526482"} Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.275653 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-56tfw" event={"ID":"37cadb5f-eedf-40ab-b4db-44d9935e26eb","Type":"ContainerStarted","Data":"879a020789f2130db92238a0e45aec5f1fe5cce2eadf2c4c3f1b30ec9fdceee9"} Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.277035 4831 generic.go:334] "Generic (PLEG): container finished" podID="0164870d-aba7-43b6-b798-93ec48968837" containerID="6b6c94559811f8faf79684dcc5dfcd29da386088365131a2a6ffa4e9e9cc5ee1" exitCode=0 Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.277801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgjhm" event={"ID":"0164870d-aba7-43b6-b798-93ec48968837","Type":"ContainerDied","Data":"6b6c94559811f8faf79684dcc5dfcd29da386088365131a2a6ffa4e9e9cc5ee1"} Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.323274 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.323330 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.323376 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.323395 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.324007 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.329166 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:30 crc 
kubenswrapper[4831]: I1204 10:17:30.329638 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.331076 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.508432 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.520489 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.530978 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.531046 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.536713 4831 patch_prober.go:28] interesting pod/console-f9d7485db-twg79 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.536762 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-twg79" podUID="bc1e4c46-b909-410a-980d-a045d5b3a636" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.537150 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.626341 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.627353 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.630634 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.630793 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.634703 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.676768 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.676811 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.723091 4831 patch_prober.go:28] interesting pod/apiserver-76f77b778f-skjn8 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 04 10:17:30 crc kubenswrapper[4831]: [+]log ok Dec 04 10:17:30 crc kubenswrapper[4831]: [+]etcd ok Dec 04 10:17:30 crc kubenswrapper[4831]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 04 10:17:30 crc kubenswrapper[4831]: [+]poststarthook/generic-apiserver-start-informers ok Dec 04 10:17:30 crc kubenswrapper[4831]: [+]poststarthook/max-in-flight-filter ok Dec 04 10:17:30 crc kubenswrapper[4831]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 04 10:17:30 crc kubenswrapper[4831]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 04 10:17:30 crc kubenswrapper[4831]: 
[-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 04 10:17:30 crc kubenswrapper[4831]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 04 10:17:30 crc kubenswrapper[4831]: [+]poststarthook/project.openshift.io-projectcache ok Dec 04 10:17:30 crc kubenswrapper[4831]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 04 10:17:30 crc kubenswrapper[4831]: [+]poststarthook/openshift.io-startinformers ok Dec 04 10:17:30 crc kubenswrapper[4831]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 04 10:17:30 crc kubenswrapper[4831]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 04 10:17:30 crc kubenswrapper[4831]: livez check failed Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.723173 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" podUID="e3ea5ca7-6c75-482a-9245-518056647743" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.729993 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c334259c-2658-491e-8e4d-37332147790c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c334259c-2658-491e-8e4d-37332147790c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.730082 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c334259c-2658-491e-8e4d-37332147790c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c334259c-2658-491e-8e4d-37332147790c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.735725 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.736644 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.780429 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.780573 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-2h8nq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.780642 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-2h8nq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.780676 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2h8nq" podUID="08171e38-fc77-4d5a-acc9-58dca0784830" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.780706 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2h8nq" podUID="08171e38-fc77-4d5a-acc9-58dca0784830" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.832160 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c334259c-2658-491e-8e4d-37332147790c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c334259c-2658-491e-8e4d-37332147790c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.832571 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c334259c-2658-491e-8e4d-37332147790c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c334259c-2658-491e-8e4d-37332147790c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.832353 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c334259c-2658-491e-8e4d-37332147790c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c334259c-2658-491e-8e4d-37332147790c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.863920 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c334259c-2658-491e-8e4d-37332147790c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c334259c-2658-491e-8e4d-37332147790c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 10:17:30 crc kubenswrapper[4831]: W1204 10:17:30.876588 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-b2517aff0f0bb2ba318de16b5342cb7feb654a789d504494af74bb67602a96e0 WatchSource:0}: Error finding container b2517aff0f0bb2ba318de16b5342cb7feb654a789d504494af74bb67602a96e0: Status 404 returned error can't find the container with id 
b2517aff0f0bb2ba318de16b5342cb7feb654a789d504494af74bb67602a96e0 Dec 04 10:17:30 crc kubenswrapper[4831]: I1204 10:17:30.998964 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.187880 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.192798 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:31 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:31 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:31 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.192840 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:31 crc kubenswrapper[4831]: W1204 10:17:31.238372 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-bbe77617a2bbcba859601a95f29dc931d05f800fc9d69b24df9c7b0ffefc8886 WatchSource:0}: Error finding container bbe77617a2bbcba859601a95f29dc931d05f800fc9d69b24df9c7b0ffefc8886: Status 404 returned error can't find the container with id bbe77617a2bbcba859601a95f29dc931d05f800fc9d69b24df9c7b0ffefc8886 Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.290633 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" 
path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.292556 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bbe77617a2bbcba859601a95f29dc931d05f800fc9d69b24df9c7b0ffefc8886"} Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.296426 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-56tfw" event={"ID":"37cadb5f-eedf-40ab-b4db-44d9935e26eb","Type":"ContainerStarted","Data":"8c1c6a27c704475dfe352d9241822b72e738615ef65d7c794a308e9f8fce12ef"} Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.296486 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-56tfw" event={"ID":"37cadb5f-eedf-40ab-b4db-44d9935e26eb","Type":"ContainerStarted","Data":"f4b9d69b3fd862b1c2ac66fb837a9f1f8f3b1af23fc1d76c97329eb02460730c"} Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.315534 4831 generic.go:334] "Generic (PLEG): container finished" podID="470670f0-d9fa-4aca-8086-a63b711191d1" containerID="4d1beaf5fdb87acf1375d4ec15b2e393aa26e577d1bf40ae0476755eb6374ec7" exitCode=0 Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.315681 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7k4c" event={"ID":"470670f0-d9fa-4aca-8086-a63b711191d1","Type":"ContainerDied","Data":"4d1beaf5fdb87acf1375d4ec15b2e393aa26e577d1bf40ae0476755eb6374ec7"} Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.318685 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.321205 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="hostpath-provisioner/csi-hostpathplugin-56tfw" podStartSLOduration=13.321186257 podStartE2EDuration="13.321186257s" podCreationTimestamp="2025-12-04 10:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:31.3143904 +0000 UTC m=+148.263565714" watchObservedRunningTime="2025-12-04 10:17:31.321186257 +0000 UTC m=+148.270361571" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.321372 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e751eccdc53a88ac7cc1634895fe0c3e0fe716ce1b41e13846e619e4fb2c1541"} Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.324755 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2df4a3d81a71e94084b741347778baaef45eb4fcc5256d04a3a1a5639ce96ba6"} Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.324792 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b2517aff0f0bb2ba318de16b5342cb7feb654a789d504494af74bb67602a96e0"} Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.325395 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.328171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" event={"ID":"ca0af9c8-f2fa-45f8-a428-9b061441bddf","Type":"ContainerStarted","Data":"c022d73d85974e6e516667345714b78fb0a15772cc42647110cf984c2551736d"} Dec 04 10:17:31 crc 
kubenswrapper[4831]: I1204 10:17:31.328363 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:31 crc kubenswrapper[4831]: W1204 10:17:31.328549 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc334259c_2658_491e_8e4d_37332147790c.slice/crio-8b16d27b258c1d3d200c955b2961ea0e21d127c74f5ed92692f1ce6888420269 WatchSource:0}: Error finding container 8b16d27b258c1d3d200c955b2961ea0e21d127c74f5ed92692f1ce6888420269: Status 404 returned error can't find the container with id 8b16d27b258c1d3d200c955b2961ea0e21d127c74f5ed92692f1ce6888420269 Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.329972 4831 generic.go:334] "Generic (PLEG): container finished" podID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" containerID="0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b" exitCode=0 Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.330195 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqht" event={"ID":"069ef39a-dfc9-4a4c-acf4-758475a8a7b0","Type":"ContainerDied","Data":"0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b"} Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.339040 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqztc" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.376856 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" podStartSLOduration=127.376809363 podStartE2EDuration="2m7.376809363s" podCreationTimestamp="2025-12-04 10:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:17:31.372851399 +0000 UTC m=+148.322026723" watchObservedRunningTime="2025-12-04 
10:17:31.376809363 +0000 UTC m=+148.325984677" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.634805 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.759220 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93463415-c819-44b0-9ee7-0de0698eb6a6-config-volume\") pod \"93463415-c819-44b0-9ee7-0de0698eb6a6\" (UID: \"93463415-c819-44b0-9ee7-0de0698eb6a6\") " Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.759341 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93463415-c819-44b0-9ee7-0de0698eb6a6-secret-volume\") pod \"93463415-c819-44b0-9ee7-0de0698eb6a6\" (UID: \"93463415-c819-44b0-9ee7-0de0698eb6a6\") " Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.759391 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmrtg\" (UniqueName: \"kubernetes.io/projected/93463415-c819-44b0-9ee7-0de0698eb6a6-kube-api-access-lmrtg\") pod \"93463415-c819-44b0-9ee7-0de0698eb6a6\" (UID: \"93463415-c819-44b0-9ee7-0de0698eb6a6\") " Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.760137 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93463415-c819-44b0-9ee7-0de0698eb6a6-config-volume" (OuterVolumeSpecName: "config-volume") pod "93463415-c819-44b0-9ee7-0de0698eb6a6" (UID: "93463415-c819-44b0-9ee7-0de0698eb6a6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.765998 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93463415-c819-44b0-9ee7-0de0698eb6a6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "93463415-c819-44b0-9ee7-0de0698eb6a6" (UID: "93463415-c819-44b0-9ee7-0de0698eb6a6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.768132 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93463415-c819-44b0-9ee7-0de0698eb6a6-kube-api-access-lmrtg" (OuterVolumeSpecName: "kube-api-access-lmrtg") pod "93463415-c819-44b0-9ee7-0de0698eb6a6" (UID: "93463415-c819-44b0-9ee7-0de0698eb6a6"). InnerVolumeSpecName "kube-api-access-lmrtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.861269 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93463415-c819-44b0-9ee7-0de0698eb6a6-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.861303 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93463415-c819-44b0-9ee7-0de0698eb6a6-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:17:31 crc kubenswrapper[4831]: I1204 10:17:31.861316 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmrtg\" (UniqueName: \"kubernetes.io/projected/93463415-c819-44b0-9ee7-0de0698eb6a6-kube-api-access-lmrtg\") on node \"crc\" DevicePath \"\"" Dec 04 10:17:32 crc kubenswrapper[4831]: I1204 10:17:32.189980 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed 
with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:32 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:32 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:32 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:32 crc kubenswrapper[4831]: I1204 10:17:32.190032 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:32 crc kubenswrapper[4831]: I1204 10:17:32.358479 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7d9a5be071685e79faf0fca203bc1aadd7479c50a5db1f908dbd2ec76abc15b5"} Dec 04 10:17:32 crc kubenswrapper[4831]: I1204 10:17:32.362649 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"57ee242df74aad7ec0ee4512129cc38c9cb2b5a2efb4f4c376467903870a1251"} Dec 04 10:17:32 crc kubenswrapper[4831]: I1204 10:17:32.364397 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c334259c-2658-491e-8e4d-37332147790c","Type":"ContainerStarted","Data":"f181d9ae5d531115297a448f41e2cd00159fe791bc080c14a802856e760e7a33"} Dec 04 10:17:32 crc kubenswrapper[4831]: I1204 10:17:32.364429 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c334259c-2658-491e-8e4d-37332147790c","Type":"ContainerStarted","Data":"8b16d27b258c1d3d200c955b2961ea0e21d127c74f5ed92692f1ce6888420269"} Dec 04 10:17:32 crc kubenswrapper[4831]: I1204 
10:17:32.366900 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" Dec 04 10:17:32 crc kubenswrapper[4831]: I1204 10:17:32.369472 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj" event={"ID":"93463415-c819-44b0-9ee7-0de0698eb6a6","Type":"ContainerDied","Data":"3c8c63a50ccd40eba1f8caa904ca1dda2c0e26776ba8110a64371fc44a541c0c"} Dec 04 10:17:32 crc kubenswrapper[4831]: I1204 10:17:32.369502 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c8c63a50ccd40eba1f8caa904ca1dda2c0e26776ba8110a64371fc44a541c0c" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.188546 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:33 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:33 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:33 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.188602 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.381193 4831 generic.go:334] "Generic (PLEG): container finished" podID="c334259c-2658-491e-8e4d-37332147790c" containerID="f181d9ae5d531115297a448f41e2cd00159fe791bc080c14a802856e760e7a33" exitCode=0 Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.381295 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"c334259c-2658-491e-8e4d-37332147790c","Type":"ContainerDied","Data":"f181d9ae5d531115297a448f41e2cd00159fe791bc080c14a802856e760e7a33"} Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.849977 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 10:17:33 crc kubenswrapper[4831]: E1204 10:17:33.850787 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93463415-c819-44b0-9ee7-0de0698eb6a6" containerName="collect-profiles" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.850806 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="93463415-c819-44b0-9ee7-0de0698eb6a6" containerName="collect-profiles" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.850935 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="93463415-c819-44b0-9ee7-0de0698eb6a6" containerName="collect-profiles" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.851573 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.854304 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.854943 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.855280 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.888755 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27020e38-b505-48b1-8068-96c14dba1b9d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27020e38-b505-48b1-8068-96c14dba1b9d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.888855 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27020e38-b505-48b1-8068-96c14dba1b9d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27020e38-b505-48b1-8068-96c14dba1b9d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.989826 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27020e38-b505-48b1-8068-96c14dba1b9d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27020e38-b505-48b1-8068-96c14dba1b9d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.989892 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/27020e38-b505-48b1-8068-96c14dba1b9d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27020e38-b505-48b1-8068-96c14dba1b9d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 10:17:33 crc kubenswrapper[4831]: I1204 10:17:33.991341 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27020e38-b505-48b1-8068-96c14dba1b9d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27020e38-b505-48b1-8068-96c14dba1b9d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 10:17:34 crc kubenswrapper[4831]: I1204 10:17:34.014797 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27020e38-b505-48b1-8068-96c14dba1b9d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27020e38-b505-48b1-8068-96c14dba1b9d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 10:17:34 crc kubenswrapper[4831]: I1204 10:17:34.187951 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:34 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:34 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:34 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:34 crc kubenswrapper[4831]: I1204 10:17:34.188031 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:34 crc kubenswrapper[4831]: I1204 10:17:34.193538 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 10:17:35 crc kubenswrapper[4831]: I1204 10:17:35.189270 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:35 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:35 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:35 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:35 crc kubenswrapper[4831]: I1204 10:17:35.189821 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:35 crc kubenswrapper[4831]: I1204 10:17:35.671231 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:35 crc kubenswrapper[4831]: I1204 10:17:35.676839 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-skjn8" Dec 04 10:17:36 crc kubenswrapper[4831]: I1204 10:17:36.188079 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:36 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:36 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:36 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:36 crc kubenswrapper[4831]: I1204 10:17:36.188139 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" 
podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:37 crc kubenswrapper[4831]: I1204 10:17:37.187316 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:37 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:37 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:37 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:37 crc kubenswrapper[4831]: I1204 10:17:37.187368 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:37 crc kubenswrapper[4831]: I1204 10:17:37.220139 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lzxr6" Dec 04 10:17:38 crc kubenswrapper[4831]: I1204 10:17:38.187618 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:38 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:38 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:38 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:38 crc kubenswrapper[4831]: I1204 10:17:38.187712 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:38 crc 
kubenswrapper[4831]: I1204 10:17:38.891177 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 10:17:38 crc kubenswrapper[4831]: I1204 10:17:38.963268 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c334259c-2658-491e-8e4d-37332147790c-kubelet-dir\") pod \"c334259c-2658-491e-8e4d-37332147790c\" (UID: \"c334259c-2658-491e-8e4d-37332147790c\") " Dec 04 10:17:38 crc kubenswrapper[4831]: I1204 10:17:38.963644 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c334259c-2658-491e-8e4d-37332147790c-kube-api-access\") pod \"c334259c-2658-491e-8e4d-37332147790c\" (UID: \"c334259c-2658-491e-8e4d-37332147790c\") " Dec 04 10:17:38 crc kubenswrapper[4831]: I1204 10:17:38.963421 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c334259c-2658-491e-8e4d-37332147790c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c334259c-2658-491e-8e4d-37332147790c" (UID: "c334259c-2658-491e-8e4d-37332147790c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:17:38 crc kubenswrapper[4831]: I1204 10:17:38.968433 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c334259c-2658-491e-8e4d-37332147790c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c334259c-2658-491e-8e4d-37332147790c" (UID: "c334259c-2658-491e-8e4d-37332147790c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:17:39 crc kubenswrapper[4831]: I1204 10:17:39.064783 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c334259c-2658-491e-8e4d-37332147790c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 10:17:39 crc kubenswrapper[4831]: I1204 10:17:39.064817 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c334259c-2658-491e-8e4d-37332147790c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 10:17:39 crc kubenswrapper[4831]: I1204 10:17:39.187904 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:39 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:39 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:39 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:39 crc kubenswrapper[4831]: I1204 10:17:39.187966 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:39 crc kubenswrapper[4831]: I1204 10:17:39.446421 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c334259c-2658-491e-8e4d-37332147790c","Type":"ContainerDied","Data":"8b16d27b258c1d3d200c955b2961ea0e21d127c74f5ed92692f1ce6888420269"} Dec 04 10:17:39 crc kubenswrapper[4831]: I1204 10:17:39.446457 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b16d27b258c1d3d200c955b2961ea0e21d127c74f5ed92692f1ce6888420269" Dec 04 10:17:39 crc 
kubenswrapper[4831]: I1204 10:17:39.446460 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 10:17:40 crc kubenswrapper[4831]: I1204 10:17:40.189007 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:40 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:40 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:40 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:40 crc kubenswrapper[4831]: I1204 10:17:40.190379 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:40 crc kubenswrapper[4831]: I1204 10:17:40.531703 4831 patch_prober.go:28] interesting pod/console-f9d7485db-twg79 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Dec 04 10:17:40 crc kubenswrapper[4831]: I1204 10:17:40.531796 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-twg79" podUID="bc1e4c46-b909-410a-980d-a045d5b3a636" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Dec 04 10:17:40 crc kubenswrapper[4831]: I1204 10:17:40.786613 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2h8nq" Dec 04 10:17:40 crc kubenswrapper[4831]: E1204 10:17:40.869409 4831 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: Get \"https://access.redhat.com/webassets/docker/content/sigstore/redhat/redhat-marketplace-index@sha256=e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594/signature-1\": net/http: TLS handshake timeout" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 10:17:40 crc kubenswrapper[4831]: E1204 10:17:40.869874 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbtj9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupPro
be:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-cgzpd_openshift-marketplace(c1d2b5a4-7636-444a-88c3-38bd88a35f99): ErrImagePull: copying system image from manifest list: reading signatures: Get \"https://access.redhat.com/webassets/docker/content/sigstore/redhat/redhat-marketplace-index@sha256=e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594/signature-1\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 04 10:17:40 crc kubenswrapper[4831]: E1204 10:17:40.871279 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: Get \\\"https://access.redhat.com/webassets/docker/content/sigstore/redhat/redhat-marketplace-index@sha256=e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594/signature-1\\\": net/http: TLS handshake timeout\"" pod="openshift-marketplace/redhat-marketplace-cgzpd" podUID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" Dec 04 10:17:41 crc kubenswrapper[4831]: I1204 10:17:41.189975 4831 patch_prober.go:28] interesting pod/router-default-5444994796-nmqqm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 10:17:41 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 04 10:17:41 crc kubenswrapper[4831]: [+]process-running ok Dec 04 10:17:41 crc kubenswrapper[4831]: healthz check failed Dec 04 10:17:41 crc kubenswrapper[4831]: I1204 10:17:41.190076 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nmqqm" podUID="4783614b-88f9-4207-b6ef-f73824ce9334" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:17:41 crc kubenswrapper[4831]: E1204 10:17:41.490306 4831 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-cgzpd" podUID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" Dec 04 10:17:42 crc kubenswrapper[4831]: I1204 10:17:42.189462 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:42 crc kubenswrapper[4831]: I1204 10:17:42.191808 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-nmqqm" Dec 04 10:17:46 crc kubenswrapper[4831]: I1204 10:17:46.058222 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:17:46 crc kubenswrapper[4831]: I1204 10:17:46.064483 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b50ce71-ca0a-4532-86e9-4f779dcc7b93-metrics-certs\") pod \"network-metrics-daemon-fd6cw\" (UID: \"5b50ce71-ca0a-4532-86e9-4f779dcc7b93\") " pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:17:46 crc kubenswrapper[4831]: I1204 10:17:46.190024 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd6cw" Dec 04 10:17:49 crc kubenswrapper[4831]: I1204 10:17:49.869524 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:17:50 crc kubenswrapper[4831]: I1204 10:17:50.537032 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:50 crc kubenswrapper[4831]: I1204 10:17:50.541117 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:17:51 crc kubenswrapper[4831]: I1204 10:17:51.971940 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:17:51 crc kubenswrapper[4831]: I1204 10:17:51.972249 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:18:01 crc kubenswrapper[4831]: E1204 10:18:01.523309 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 10:18:01 crc kubenswrapper[4831]: E1204 10:18:01.523873 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9sxrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2nwlb_openshift-marketplace(d493a63f-9d1a-4995-8698-c3b75c46d69f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 10:18:01 crc kubenswrapper[4831]: E1204 10:18:01.525141 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2nwlb" 
podUID="d493a63f-9d1a-4995-8698-c3b75c46d69f" Dec 04 10:18:01 crc kubenswrapper[4831]: I1204 10:18:01.815978 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zng9d" Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.265995 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 10:18:08 crc kubenswrapper[4831]: E1204 10:18:08.267283 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c334259c-2658-491e-8e4d-37332147790c" containerName="pruner" Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.267316 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c334259c-2658-491e-8e4d-37332147790c" containerName="pruner" Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.267604 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c334259c-2658-491e-8e4d-37332147790c" containerName="pruner" Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.269270 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.277175 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.369820 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33a67ea8-4808-4214-b17b-4f647b447bf3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"33a67ea8-4808-4214-b17b-4f647b447bf3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.370113 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33a67ea8-4808-4214-b17b-4f647b447bf3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"33a67ea8-4808-4214-b17b-4f647b447bf3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.470719 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33a67ea8-4808-4214-b17b-4f647b447bf3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"33a67ea8-4808-4214-b17b-4f647b447bf3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.471112 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33a67ea8-4808-4214-b17b-4f647b447bf3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"33a67ea8-4808-4214-b17b-4f647b447bf3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.470887 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/33a67ea8-4808-4214-b17b-4f647b447bf3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"33a67ea8-4808-4214-b17b-4f647b447bf3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.488396 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33a67ea8-4808-4214-b17b-4f647b447bf3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"33a67ea8-4808-4214-b17b-4f647b447bf3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 10:18:08 crc kubenswrapper[4831]: I1204 10:18:08.586234 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 10:18:10 crc kubenswrapper[4831]: I1204 10:18:10.513208 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 10:18:10 crc kubenswrapper[4831]: E1204 10:18:10.944264 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2nwlb" podUID="d493a63f-9d1a-4995-8698-c3b75c46d69f" Dec 04 10:18:11 crc kubenswrapper[4831]: E1204 10:18:11.033557 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 04 10:18:11 crc kubenswrapper[4831]: E1204 10:18:11.033869 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g2ptk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-c7k4c_openshift-marketplace(470670f0-d9fa-4aca-8086-a63b711191d1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 10:18:11 crc kubenswrapper[4831]: E1204 10:18:11.035724 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-c7k4c" 
podUID="470670f0-d9fa-4aca-8086-a63b711191d1" Dec 04 10:18:11 crc kubenswrapper[4831]: E1204 10:18:11.883806 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-c7k4c" podUID="470670f0-d9fa-4aca-8086-a63b711191d1" Dec 04 10:18:11 crc kubenswrapper[4831]: E1204 10:18:11.979457 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 10:18:11 crc kubenswrapper[4831]: E1204 10:18:11.979950 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sc9hn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xgjhm_openshift-marketplace(0164870d-aba7-43b6-b798-93ec48968837): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 10:18:11 crc kubenswrapper[4831]: E1204 10:18:11.981235 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-xgjhm" podUID="0164870d-aba7-43b6-b798-93ec48968837" Dec 04 10:18:13 crc 
kubenswrapper[4831]: E1204 10:18:13.286099 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-xgjhm" podUID="0164870d-aba7-43b6-b798-93ec48968837" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.344366 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.344500 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7mlp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nbm8n_openshift-marketplace(16d53c94-9ca3-4b5a-b33d-829f44de5367): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.345668 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nbm8n" podUID="16d53c94-9ca3-4b5a-b33d-829f44de5367" Dec 04 10:18:13 crc 
kubenswrapper[4831]: E1204 10:18:13.350623 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.350741 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndrmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-bbqht_openshift-marketplace(069ef39a-dfc9-4a4c-acf4-758475a8a7b0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.352270 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-bbqht" podUID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.369938 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.370072 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xc589,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-k6xp2_openshift-marketplace(cdf24e44-d00c-424c-aa3c-3e7fdb1a2997): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.371221 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-k6xp2" podUID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" Dec 04 10:18:13 crc 
kubenswrapper[4831]: E1204 10:18:13.447822 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.447980 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tb4fz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-cdsf2_openshift-marketplace(e6ffc610-2a29-4b62-abc6-2cc0568c8bc1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.449208 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cdsf2" podUID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.635473 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgzpd" event={"ID":"c1d2b5a4-7636-444a-88c3-38bd88a35f99","Type":"ContainerStarted","Data":"aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf"} Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.636340 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nbm8n" podUID="16d53c94-9ca3-4b5a-b33d-829f44de5367" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.636941 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-k6xp2" podUID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.637010 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cdsf2" podUID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" Dec 04 10:18:13 crc kubenswrapper[4831]: E1204 10:18:13.637091 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bbqht" podUID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.657401 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.658524 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.668775 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.757843 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.823672 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.828430 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fd6cw"] Dec 04 10:18:13 crc kubenswrapper[4831]: W1204 10:18:13.832611 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod33a67ea8_4808_4214_b17b_4f647b447bf3.slice/crio-56afb16f503c0258321420e467e4632ae2fddfa413473a464b53fe6e200ebd14 WatchSource:0}: Error finding container 56afb16f503c0258321420e467e4632ae2fddfa413473a464b53fe6e200ebd14: Status 404 returned error can't 
find the container with id 56afb16f503c0258321420e467e4632ae2fddfa413473a464b53fe6e200ebd14 Dec 04 10:18:13 crc kubenswrapper[4831]: W1204 10:18:13.836940 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b50ce71_ca0a_4532_86e9_4f779dcc7b93.slice/crio-20294568c9b580df1e11bdb25c4c02328a510adfa880db6d3cb8574d6985465f WatchSource:0}: Error finding container 20294568c9b580df1e11bdb25c4c02328a510adfa880db6d3cb8574d6985465f: Status 404 returned error can't find the container with id 20294568c9b580df1e11bdb25c4c02328a510adfa880db6d3cb8574d6985465f Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.848301 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.848387 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-var-lock\") pod \"installer-9-crc\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.848416 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kube-api-access\") pod \"installer-9-crc\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.949160 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.949463 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-var-lock\") pod \"installer-9-crc\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.949485 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kube-api-access\") pod \"installer-9-crc\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.949488 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-var-lock\") pod \"installer-9-crc\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.949263 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:13 crc kubenswrapper[4831]: I1204 10:18:13.966475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kube-api-access\") pod \"installer-9-crc\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " 
pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.031255 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.439837 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 10:18:14 crc kubenswrapper[4831]: W1204 10:18:14.453147 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfe06f3bd_34a4_4f9c_9258_ca780b0a510b.slice/crio-92fd6db70bd70204fa4e2acc4c172baefcdc78f0bd0fc9fb96bad3f3fc08a964 WatchSource:0}: Error finding container 92fd6db70bd70204fa4e2acc4c172baefcdc78f0bd0fc9fb96bad3f3fc08a964: Status 404 returned error can't find the container with id 92fd6db70bd70204fa4e2acc4c172baefcdc78f0bd0fc9fb96bad3f3fc08a964 Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.643690 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgzpd" event={"ID":"c1d2b5a4-7636-444a-88c3-38bd88a35f99","Type":"ContainerDied","Data":"aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf"} Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.643688 4831 generic.go:334] "Generic (PLEG): container finished" podID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" containerID="aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf" exitCode=0 Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.646582 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fe06f3bd-34a4-4f9c-9258-ca780b0a510b","Type":"ContainerStarted","Data":"92fd6db70bd70204fa4e2acc4c172baefcdc78f0bd0fc9fb96bad3f3fc08a964"} Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.649498 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"33a67ea8-4808-4214-b17b-4f647b447bf3","Type":"ContainerStarted","Data":"41eddc86434877a682592c84f01b63f46f86a9deafe866d12f4f2bc87d613cc4"} Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.649558 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"33a67ea8-4808-4214-b17b-4f647b447bf3","Type":"ContainerStarted","Data":"56afb16f503c0258321420e467e4632ae2fddfa413473a464b53fe6e200ebd14"} Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.652180 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27020e38-b505-48b1-8068-96c14dba1b9d","Type":"ContainerStarted","Data":"fd7ac7a72d5b289c2dd07cfb5996451c3ce9edf8a067f7cf850901e8136c81be"} Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.652231 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27020e38-b505-48b1-8068-96c14dba1b9d","Type":"ContainerStarted","Data":"dd68e13090481f07ce876eb5a9ccfce072e9b27f16ed01e1e6993e5aa6893b23"} Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.654896 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" event={"ID":"5b50ce71-ca0a-4532-86e9-4f779dcc7b93","Type":"ContainerStarted","Data":"8564b60ff29263a5d412efe8fec011249b211a6cbf24bf1a806eb445fd323ee6"} Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.654942 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" event={"ID":"5b50ce71-ca0a-4532-86e9-4f779dcc7b93","Type":"ContainerStarted","Data":"131cbb2a41096798990e58f38cadd69713257b5d833f3772da767cff3e0f093b"} Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.654961 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fd6cw" 
event={"ID":"5b50ce71-ca0a-4532-86e9-4f779dcc7b93","Type":"ContainerStarted","Data":"20294568c9b580df1e11bdb25c4c02328a510adfa880db6d3cb8574d6985465f"} Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.678268 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=41.678245304 podStartE2EDuration="41.678245304s" podCreationTimestamp="2025-12-04 10:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:18:14.674343001 +0000 UTC m=+191.623518385" watchObservedRunningTime="2025-12-04 10:18:14.678245304 +0000 UTC m=+191.627420658" Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.701344 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fd6cw" podStartSLOduration=171.70131024 podStartE2EDuration="2m51.70131024s" podCreationTimestamp="2025-12-04 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:18:14.696952914 +0000 UTC m=+191.646128328" watchObservedRunningTime="2025-12-04 10:18:14.70131024 +0000 UTC m=+191.650485604" Dec 04 10:18:14 crc kubenswrapper[4831]: I1204 10:18:14.719416 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=6.7194001629999995 podStartE2EDuration="6.719400163s" podCreationTimestamp="2025-12-04 10:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:18:14.718146856 +0000 UTC m=+191.667322210" watchObservedRunningTime="2025-12-04 10:18:14.719400163 +0000 UTC m=+191.668575477" Dec 04 10:18:15 crc kubenswrapper[4831]: I1204 10:18:15.663987 4831 generic.go:334] "Generic (PLEG): 
container finished" podID="33a67ea8-4808-4214-b17b-4f647b447bf3" containerID="41eddc86434877a682592c84f01b63f46f86a9deafe866d12f4f2bc87d613cc4" exitCode=0 Dec 04 10:18:15 crc kubenswrapper[4831]: I1204 10:18:15.664102 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"33a67ea8-4808-4214-b17b-4f647b447bf3","Type":"ContainerDied","Data":"41eddc86434877a682592c84f01b63f46f86a9deafe866d12f4f2bc87d613cc4"} Dec 04 10:18:15 crc kubenswrapper[4831]: I1204 10:18:15.668375 4831 generic.go:334] "Generic (PLEG): container finished" podID="27020e38-b505-48b1-8068-96c14dba1b9d" containerID="fd7ac7a72d5b289c2dd07cfb5996451c3ce9edf8a067f7cf850901e8136c81be" exitCode=0 Dec 04 10:18:15 crc kubenswrapper[4831]: I1204 10:18:15.668494 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27020e38-b505-48b1-8068-96c14dba1b9d","Type":"ContainerDied","Data":"fd7ac7a72d5b289c2dd07cfb5996451c3ce9edf8a067f7cf850901e8136c81be"} Dec 04 10:18:15 crc kubenswrapper[4831]: I1204 10:18:15.671561 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgzpd" event={"ID":"c1d2b5a4-7636-444a-88c3-38bd88a35f99","Type":"ContainerStarted","Data":"19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe"} Dec 04 10:18:15 crc kubenswrapper[4831]: I1204 10:18:15.673801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fe06f3bd-34a4-4f9c-9258-ca780b0a510b","Type":"ContainerStarted","Data":"73928568938a5f466568a9cc96eeada6f3f43226837fb06565369da84f65483b"} Dec 04 10:18:15 crc kubenswrapper[4831]: I1204 10:18:15.699596 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.699577459 podStartE2EDuration="2.699577459s" podCreationTimestamp="2025-12-04 10:18:13 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:18:15.699137406 +0000 UTC m=+192.648312720" watchObservedRunningTime="2025-12-04 10:18:15.699577459 +0000 UTC m=+192.648752773" Dec 04 10:18:15 crc kubenswrapper[4831]: I1204 10:18:15.725642 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cgzpd" podStartSLOduration=3.651945869 podStartE2EDuration="48.725623141s" podCreationTimestamp="2025-12-04 10:17:27 +0000 UTC" firstStartedPulling="2025-12-04 10:17:30.302816216 +0000 UTC m=+147.251991530" lastFinishedPulling="2025-12-04 10:18:15.376493498 +0000 UTC m=+192.325668802" observedRunningTime="2025-12-04 10:18:15.724078196 +0000 UTC m=+192.673253520" watchObservedRunningTime="2025-12-04 10:18:15.725623141 +0000 UTC m=+192.674798455" Dec 04 10:18:16 crc kubenswrapper[4831]: I1204 10:18:16.982586 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.039212 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.183031 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33a67ea8-4808-4214-b17b-4f647b447bf3-kubelet-dir\") pod \"33a67ea8-4808-4214-b17b-4f647b447bf3\" (UID: \"33a67ea8-4808-4214-b17b-4f647b447bf3\") " Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.183075 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27020e38-b505-48b1-8068-96c14dba1b9d-kubelet-dir\") pod \"27020e38-b505-48b1-8068-96c14dba1b9d\" (UID: \"27020e38-b505-48b1-8068-96c14dba1b9d\") " Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.183134 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27020e38-b505-48b1-8068-96c14dba1b9d-kube-api-access\") pod \"27020e38-b505-48b1-8068-96c14dba1b9d\" (UID: \"27020e38-b505-48b1-8068-96c14dba1b9d\") " Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.183165 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33a67ea8-4808-4214-b17b-4f647b447bf3-kube-api-access\") pod \"33a67ea8-4808-4214-b17b-4f647b447bf3\" (UID: \"33a67ea8-4808-4214-b17b-4f647b447bf3\") " Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.183418 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27020e38-b505-48b1-8068-96c14dba1b9d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "27020e38-b505-48b1-8068-96c14dba1b9d" (UID: "27020e38-b505-48b1-8068-96c14dba1b9d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.183455 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33a67ea8-4808-4214-b17b-4f647b447bf3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "33a67ea8-4808-4214-b17b-4f647b447bf3" (UID: "33a67ea8-4808-4214-b17b-4f647b447bf3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.187989 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27020e38-b505-48b1-8068-96c14dba1b9d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "27020e38-b505-48b1-8068-96c14dba1b9d" (UID: "27020e38-b505-48b1-8068-96c14dba1b9d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.188828 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33a67ea8-4808-4214-b17b-4f647b447bf3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "33a67ea8-4808-4214-b17b-4f647b447bf3" (UID: "33a67ea8-4808-4214-b17b-4f647b447bf3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.284903 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33a67ea8-4808-4214-b17b-4f647b447bf3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.284929 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27020e38-b505-48b1-8068-96c14dba1b9d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.284938 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27020e38-b505-48b1-8068-96c14dba1b9d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.284947 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33a67ea8-4808-4214-b17b-4f647b447bf3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.685563 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"33a67ea8-4808-4214-b17b-4f647b447bf3","Type":"ContainerDied","Data":"56afb16f503c0258321420e467e4632ae2fddfa413473a464b53fe6e200ebd14"} Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.685612 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.685634 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56afb16f503c0258321420e467e4632ae2fddfa413473a464b53fe6e200ebd14" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.687593 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27020e38-b505-48b1-8068-96c14dba1b9d","Type":"ContainerDied","Data":"dd68e13090481f07ce876eb5a9ccfce072e9b27f16ed01e1e6993e5aa6893b23"} Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.687616 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd68e13090481f07ce876eb5a9ccfce072e9b27f16ed01e1e6993e5aa6893b23" Dec 04 10:18:17 crc kubenswrapper[4831]: I1204 10:18:17.687696 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 10:18:18 crc kubenswrapper[4831]: I1204 10:18:18.381897 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:18:18 crc kubenswrapper[4831]: I1204 10:18:18.382168 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:18:18 crc kubenswrapper[4831]: I1204 10:18:18.440956 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:18:21 crc kubenswrapper[4831]: I1204 10:18:21.971803 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:18:21 crc kubenswrapper[4831]: I1204 
10:18:21.972141 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:18:25 crc kubenswrapper[4831]: I1204 10:18:25.726733 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nwlb" event={"ID":"d493a63f-9d1a-4995-8698-c3b75c46d69f","Type":"ContainerStarted","Data":"9fe9dec0237d3c55a331e8c0d1c8a37f50468981792b8c52df95e449ff350f24"} Dec 04 10:18:26 crc kubenswrapper[4831]: I1204 10:18:26.735166 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7k4c" event={"ID":"470670f0-d9fa-4aca-8086-a63b711191d1","Type":"ContainerStarted","Data":"567715934c3b8fa65dc09b65835c65eb08ba4c73479c0be2de65cf01291db06f"} Dec 04 10:18:26 crc kubenswrapper[4831]: I1204 10:18:26.750026 4831 generic.go:334] "Generic (PLEG): container finished" podID="d493a63f-9d1a-4995-8698-c3b75c46d69f" containerID="9fe9dec0237d3c55a331e8c0d1c8a37f50468981792b8c52df95e449ff350f24" exitCode=0 Dec 04 10:18:26 crc kubenswrapper[4831]: I1204 10:18:26.750078 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nwlb" event={"ID":"d493a63f-9d1a-4995-8698-c3b75c46d69f","Type":"ContainerDied","Data":"9fe9dec0237d3c55a331e8c0d1c8a37f50468981792b8c52df95e449ff350f24"} Dec 04 10:18:27 crc kubenswrapper[4831]: I1204 10:18:27.763914 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nwlb" event={"ID":"d493a63f-9d1a-4995-8698-c3b75c46d69f","Type":"ContainerStarted","Data":"4a558ae5d67c86868ad78b63eca19665484b05adc6602c36258c8b0d06d36e66"} Dec 04 10:18:27 crc kubenswrapper[4831]: I1204 10:18:27.766200 4831 generic.go:334] "Generic (PLEG): 
container finished" podID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" containerID="cac3ad17efef243893e9b0aba847f8cd57662629862857e4e66fc5d502535d16" exitCode=0 Dec 04 10:18:27 crc kubenswrapper[4831]: I1204 10:18:27.766259 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdsf2" event={"ID":"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1","Type":"ContainerDied","Data":"cac3ad17efef243893e9b0aba847f8cd57662629862857e4e66fc5d502535d16"} Dec 04 10:18:27 crc kubenswrapper[4831]: I1204 10:18:27.769844 4831 generic.go:334] "Generic (PLEG): container finished" podID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" containerID="f65497c5935b108d68c9c9bdfab5b9984be8d376d71d1770615e21ddff93b5e1" exitCode=0 Dec 04 10:18:27 crc kubenswrapper[4831]: I1204 10:18:27.769902 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6xp2" event={"ID":"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997","Type":"ContainerDied","Data":"f65497c5935b108d68c9c9bdfab5b9984be8d376d71d1770615e21ddff93b5e1"} Dec 04 10:18:27 crc kubenswrapper[4831]: I1204 10:18:27.775894 4831 generic.go:334] "Generic (PLEG): container finished" podID="470670f0-d9fa-4aca-8086-a63b711191d1" containerID="567715934c3b8fa65dc09b65835c65eb08ba4c73479c0be2de65cf01291db06f" exitCode=0 Dec 04 10:18:27 crc kubenswrapper[4831]: I1204 10:18:27.775926 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7k4c" event={"ID":"470670f0-d9fa-4aca-8086-a63b711191d1","Type":"ContainerDied","Data":"567715934c3b8fa65dc09b65835c65eb08ba4c73479c0be2de65cf01291db06f"} Dec 04 10:18:27 crc kubenswrapper[4831]: I1204 10:18:27.801766 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2nwlb" podStartSLOduration=3.5766211820000002 podStartE2EDuration="1m2.801742923s" podCreationTimestamp="2025-12-04 10:17:25 +0000 UTC" firstStartedPulling="2025-12-04 10:17:28.014835141 
+0000 UTC m=+144.964010455" lastFinishedPulling="2025-12-04 10:18:27.239956852 +0000 UTC m=+204.189132196" observedRunningTime="2025-12-04 10:18:27.784081395 +0000 UTC m=+204.733256719" watchObservedRunningTime="2025-12-04 10:18:27.801742923 +0000 UTC m=+204.750918237" Dec 04 10:18:28 crc kubenswrapper[4831]: I1204 10:18:28.421788 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:18:28 crc kubenswrapper[4831]: I1204 10:18:28.782880 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdsf2" event={"ID":"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1","Type":"ContainerStarted","Data":"056a642aad6281e3a4ae8a9fd8c521078f8e1651e9eebfcc44ff896cc5e23e43"} Dec 04 10:18:28 crc kubenswrapper[4831]: I1204 10:18:28.785319 4831 generic.go:334] "Generic (PLEG): container finished" podID="0164870d-aba7-43b6-b798-93ec48968837" containerID="cac80f8593894e36e4f9857cacc4e0368924db553aceea716589908a51699e70" exitCode=0 Dec 04 10:18:28 crc kubenswrapper[4831]: I1204 10:18:28.785376 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgjhm" event={"ID":"0164870d-aba7-43b6-b798-93ec48968837","Type":"ContainerDied","Data":"cac80f8593894e36e4f9857cacc4e0368924db553aceea716589908a51699e70"} Dec 04 10:18:28 crc kubenswrapper[4831]: I1204 10:18:28.792392 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6xp2" event={"ID":"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997","Type":"ContainerStarted","Data":"398a5435e0416adc6d4938d7ad0d91b9ca52042c42d9094b63038b7167258736"} Dec 04 10:18:28 crc kubenswrapper[4831]: I1204 10:18:28.804085 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cdsf2" podStartSLOduration=3.8044577569999998 podStartE2EDuration="1m3.804064419s" podCreationTimestamp="2025-12-04 10:17:25 +0000 
UTC" firstStartedPulling="2025-12-04 10:17:28.235866615 +0000 UTC m=+145.185041929" lastFinishedPulling="2025-12-04 10:18:28.235473277 +0000 UTC m=+205.184648591" observedRunningTime="2025-12-04 10:18:28.80303053 +0000 UTC m=+205.752205864" watchObservedRunningTime="2025-12-04 10:18:28.804064419 +0000 UTC m=+205.753239733" Dec 04 10:18:28 crc kubenswrapper[4831]: I1204 10:18:28.847728 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k6xp2" podStartSLOduration=3.504102079 podStartE2EDuration="1m3.847710599s" podCreationTimestamp="2025-12-04 10:17:25 +0000 UTC" firstStartedPulling="2025-12-04 10:17:27.844826342 +0000 UTC m=+144.794001656" lastFinishedPulling="2025-12-04 10:18:28.188434862 +0000 UTC m=+205.137610176" observedRunningTime="2025-12-04 10:18:28.845714662 +0000 UTC m=+205.794890006" watchObservedRunningTime="2025-12-04 10:18:28.847710599 +0000 UTC m=+205.796885913" Dec 04 10:18:29 crc kubenswrapper[4831]: I1204 10:18:29.799746 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7k4c" event={"ID":"470670f0-d9fa-4aca-8086-a63b711191d1","Type":"ContainerStarted","Data":"ffae62b72edfa3152ace2371bb333b17c2024866896e064465545a1df1ac1199"} Dec 04 10:18:29 crc kubenswrapper[4831]: I1204 10:18:29.801603 4831 generic.go:334] "Generic (PLEG): container finished" podID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" containerID="2a385cb47e0c776be64864678c3446b6ec49aa0ce5bd5986a1e6eb8253d4d2af" exitCode=0 Dec 04 10:18:29 crc kubenswrapper[4831]: I1204 10:18:29.801702 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqht" event={"ID":"069ef39a-dfc9-4a4c-acf4-758475a8a7b0","Type":"ContainerDied","Data":"2a385cb47e0c776be64864678c3446b6ec49aa0ce5bd5986a1e6eb8253d4d2af"} Dec 04 10:18:29 crc kubenswrapper[4831]: I1204 10:18:29.819999 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-c7k4c" podStartSLOduration=3.737531997 podStartE2EDuration="1m1.819982458s" podCreationTimestamp="2025-12-04 10:17:28 +0000 UTC" firstStartedPulling="2025-12-04 10:17:31.317228262 +0000 UTC m=+148.266403576" lastFinishedPulling="2025-12-04 10:18:29.399678723 +0000 UTC m=+206.348854037" observedRunningTime="2025-12-04 10:18:29.81720144 +0000 UTC m=+206.766376744" watchObservedRunningTime="2025-12-04 10:18:29.819982458 +0000 UTC m=+206.769157772" Dec 04 10:18:30 crc kubenswrapper[4831]: I1204 10:18:30.808678 4831 generic.go:334] "Generic (PLEG): container finished" podID="16d53c94-9ca3-4b5a-b33d-829f44de5367" containerID="e429cd7acf7a841733146b57baf4d335bd58700bbff2acd50cd7c5e20d635e6c" exitCode=0 Dec 04 10:18:30 crc kubenswrapper[4831]: I1204 10:18:30.808847 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbm8n" event={"ID":"16d53c94-9ca3-4b5a-b33d-829f44de5367","Type":"ContainerDied","Data":"e429cd7acf7a841733146b57baf4d335bd58700bbff2acd50cd7c5e20d635e6c"} Dec 04 10:18:30 crc kubenswrapper[4831]: I1204 10:18:30.812593 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqht" event={"ID":"069ef39a-dfc9-4a4c-acf4-758475a8a7b0","Type":"ContainerStarted","Data":"fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4"} Dec 04 10:18:30 crc kubenswrapper[4831]: I1204 10:18:30.815533 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgjhm" event={"ID":"0164870d-aba7-43b6-b798-93ec48968837","Type":"ContainerStarted","Data":"d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204"} Dec 04 10:18:30 crc kubenswrapper[4831]: I1204 10:18:30.847835 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xgjhm" podStartSLOduration=3.979856583 podStartE2EDuration="1m3.847820424s" 
podCreationTimestamp="2025-12-04 10:17:27 +0000 UTC" firstStartedPulling="2025-12-04 10:17:30.302457876 +0000 UTC m=+147.251633190" lastFinishedPulling="2025-12-04 10:18:30.170421717 +0000 UTC m=+207.119597031" observedRunningTime="2025-12-04 10:18:30.844354636 +0000 UTC m=+207.793529950" watchObservedRunningTime="2025-12-04 10:18:30.847820424 +0000 UTC m=+207.796995738" Dec 04 10:18:30 crc kubenswrapper[4831]: I1204 10:18:30.875770 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bbqht" podStartSLOduration=3.98008422 podStartE2EDuration="1m2.87575163s" podCreationTimestamp="2025-12-04 10:17:28 +0000 UTC" firstStartedPulling="2025-12-04 10:17:31.333396029 +0000 UTC m=+148.282571353" lastFinishedPulling="2025-12-04 10:18:30.229063449 +0000 UTC m=+207.178238763" observedRunningTime="2025-12-04 10:18:30.874387622 +0000 UTC m=+207.823562956" watchObservedRunningTime="2025-12-04 10:18:30.87575163 +0000 UTC m=+207.824926944" Dec 04 10:18:30 crc kubenswrapper[4831]: I1204 10:18:30.908822 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgzpd"] Dec 04 10:18:30 crc kubenswrapper[4831]: I1204 10:18:30.909200 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cgzpd" podUID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" containerName="registry-server" containerID="cri-o://19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe" gracePeriod=2 Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.258868 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.360820 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-catalog-content\") pod \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\" (UID: \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.361070 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-utilities\") pod \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\" (UID: \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.361110 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbtj9\" (UniqueName: \"kubernetes.io/projected/c1d2b5a4-7636-444a-88c3-38bd88a35f99-kube-api-access-cbtj9\") pod \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\" (UID: \"c1d2b5a4-7636-444a-88c3-38bd88a35f99\") " Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.361860 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-utilities" (OuterVolumeSpecName: "utilities") pod "c1d2b5a4-7636-444a-88c3-38bd88a35f99" (UID: "c1d2b5a4-7636-444a-88c3-38bd88a35f99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.370395 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d2b5a4-7636-444a-88c3-38bd88a35f99-kube-api-access-cbtj9" (OuterVolumeSpecName: "kube-api-access-cbtj9") pod "c1d2b5a4-7636-444a-88c3-38bd88a35f99" (UID: "c1d2b5a4-7636-444a-88c3-38bd88a35f99"). InnerVolumeSpecName "kube-api-access-cbtj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.382703 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1d2b5a4-7636-444a-88c3-38bd88a35f99" (UID: "c1d2b5a4-7636-444a-88c3-38bd88a35f99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.462763 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.463091 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbtj9\" (UniqueName: \"kubernetes.io/projected/c1d2b5a4-7636-444a-88c3-38bd88a35f99-kube-api-access-cbtj9\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.463180 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1d2b5a4-7636-444a-88c3-38bd88a35f99-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.821373 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbm8n" event={"ID":"16d53c94-9ca3-4b5a-b33d-829f44de5367","Type":"ContainerStarted","Data":"771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4"} Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.823756 4831 generic.go:334] "Generic (PLEG): container finished" podID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" containerID="19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe" exitCode=0 Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.823788 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-cgzpd" event={"ID":"c1d2b5a4-7636-444a-88c3-38bd88a35f99","Type":"ContainerDied","Data":"19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe"} Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.823812 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgzpd" event={"ID":"c1d2b5a4-7636-444a-88c3-38bd88a35f99","Type":"ContainerDied","Data":"5a8b3302b4c33265a169cf3814d4e9b6419bee391df98534c2da1cc22dae181b"} Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.823813 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgzpd" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.823832 4831 scope.go:117] "RemoveContainer" containerID="19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.838075 4831 scope.go:117] "RemoveContainer" containerID="aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.849965 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nbm8n" podStartSLOduration=3.587194502 podStartE2EDuration="1m6.849940795s" podCreationTimestamp="2025-12-04 10:17:25 +0000 UTC" firstStartedPulling="2025-12-04 10:17:28.226146044 +0000 UTC m=+145.175321358" lastFinishedPulling="2025-12-04 10:18:31.488892337 +0000 UTC m=+208.438067651" observedRunningTime="2025-12-04 10:18:31.847454165 +0000 UTC m=+208.796629509" watchObservedRunningTime="2025-12-04 10:18:31.849940795 +0000 UTC m=+208.799116129" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.857736 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgzpd"] Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.863296 4831 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-cgzpd"] Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.870284 4831 scope.go:117] "RemoveContainer" containerID="bda75a0751b7174d619161e97d7751cd79e9e49952bf29f6262318c72c526482" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.881782 4831 scope.go:117] "RemoveContainer" containerID="19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe" Dec 04 10:18:31 crc kubenswrapper[4831]: E1204 10:18:31.882273 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe\": container with ID starting with 19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe not found: ID does not exist" containerID="19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.882316 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe"} err="failed to get container status \"19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe\": rpc error: code = NotFound desc = could not find container \"19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe\": container with ID starting with 19d6203266a999f016a082635afb1e4d0a1870ab5c188da90569ce268d02bffe not found: ID does not exist" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.882393 4831 scope.go:117] "RemoveContainer" containerID="aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf" Dec 04 10:18:31 crc kubenswrapper[4831]: E1204 10:18:31.882822 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf\": container with ID starting with 
aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf not found: ID does not exist" containerID="aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.882846 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf"} err="failed to get container status \"aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf\": rpc error: code = NotFound desc = could not find container \"aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf\": container with ID starting with aa08b3dc6719a5ff09f802a7f4eaa1b4f9210d4f9faf827b39f984dbd54c5ccf not found: ID does not exist" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.882865 4831 scope.go:117] "RemoveContainer" containerID="bda75a0751b7174d619161e97d7751cd79e9e49952bf29f6262318c72c526482" Dec 04 10:18:31 crc kubenswrapper[4831]: E1204 10:18:31.883189 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bda75a0751b7174d619161e97d7751cd79e9e49952bf29f6262318c72c526482\": container with ID starting with bda75a0751b7174d619161e97d7751cd79e9e49952bf29f6262318c72c526482 not found: ID does not exist" containerID="bda75a0751b7174d619161e97d7751cd79e9e49952bf29f6262318c72c526482" Dec 04 10:18:31 crc kubenswrapper[4831]: I1204 10:18:31.883222 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bda75a0751b7174d619161e97d7751cd79e9e49952bf29f6262318c72c526482"} err="failed to get container status \"bda75a0751b7174d619161e97d7751cd79e9e49952bf29f6262318c72c526482\": rpc error: code = NotFound desc = could not find container \"bda75a0751b7174d619161e97d7751cd79e9e49952bf29f6262318c72c526482\": container with ID starting with bda75a0751b7174d619161e97d7751cd79e9e49952bf29f6262318c72c526482 not found: ID does not 
exist" Dec 04 10:18:33 crc kubenswrapper[4831]: I1204 10:18:33.283026 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" path="/var/lib/kubelet/pods/c1d2b5a4-7636-444a-88c3-38bd88a35f99/volumes" Dec 04 10:18:35 crc kubenswrapper[4831]: I1204 10:18:35.750705 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:18:35 crc kubenswrapper[4831]: I1204 10:18:35.751145 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:18:35 crc kubenswrapper[4831]: I1204 10:18:35.799092 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:18:35 crc kubenswrapper[4831]: I1204 10:18:35.893718 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:18:35 crc kubenswrapper[4831]: I1204 10:18:35.946716 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:18:35 crc kubenswrapper[4831]: I1204 10:18:35.946785 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:18:35 crc kubenswrapper[4831]: I1204 10:18:35.987827 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:18:36 crc kubenswrapper[4831]: I1204 10:18:36.176315 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:18:36 crc kubenswrapper[4831]: I1204 10:18:36.176366 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:18:36 crc 
kubenswrapper[4831]: I1204 10:18:36.240861 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:18:36 crc kubenswrapper[4831]: I1204 10:18:36.374692 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:18:36 crc kubenswrapper[4831]: I1204 10:18:36.374751 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:18:36 crc kubenswrapper[4831]: I1204 10:18:36.416049 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:18:36 crc kubenswrapper[4831]: I1204 10:18:36.889979 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:18:36 crc kubenswrapper[4831]: I1204 10:18:36.892403 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:18:36 crc kubenswrapper[4831]: I1204 10:18:36.893456 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:18:38 crc kubenswrapper[4831]: I1204 10:18:38.201035 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:18:38 crc kubenswrapper[4831]: I1204 10:18:38.201119 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:18:38 crc kubenswrapper[4831]: I1204 10:18:38.246950 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:18:38 crc kubenswrapper[4831]: I1204 10:18:38.317687 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-2nwlb"] Dec 04 10:18:38 crc kubenswrapper[4831]: I1204 10:18:38.874446 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2nwlb" podUID="d493a63f-9d1a-4995-8698-c3b75c46d69f" containerName="registry-server" containerID="cri-o://4a558ae5d67c86868ad78b63eca19665484b05adc6602c36258c8b0d06d36e66" gracePeriod=2 Dec 04 10:18:38 crc kubenswrapper[4831]: I1204 10:18:38.912018 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:18:38 crc kubenswrapper[4831]: I1204 10:18:38.936807 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:18:38 crc kubenswrapper[4831]: I1204 10:18:38.937078 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:18:38 crc kubenswrapper[4831]: I1204 10:18:38.977215 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:18:39 crc kubenswrapper[4831]: I1204 10:18:39.307510 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:18:39 crc kubenswrapper[4831]: I1204 10:18:39.307571 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:18:39 crc kubenswrapper[4831]: I1204 10:18:39.312986 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cdsf2"] Dec 04 10:18:39 crc kubenswrapper[4831]: I1204 10:18:39.313260 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cdsf2" podUID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" containerName="registry-server" 
containerID="cri-o://056a642aad6281e3a4ae8a9fd8c521078f8e1651e9eebfcc44ff896cc5e23e43" gracePeriod=2 Dec 04 10:18:39 crc kubenswrapper[4831]: I1204 10:18:39.346771 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:18:39 crc kubenswrapper[4831]: I1204 10:18:39.929532 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:18:39 crc kubenswrapper[4831]: I1204 10:18:39.947195 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:18:40 crc kubenswrapper[4831]: I1204 10:18:40.886620 4831 generic.go:334] "Generic (PLEG): container finished" podID="d493a63f-9d1a-4995-8698-c3b75c46d69f" containerID="4a558ae5d67c86868ad78b63eca19665484b05adc6602c36258c8b0d06d36e66" exitCode=0 Dec 04 10:18:40 crc kubenswrapper[4831]: I1204 10:18:40.886696 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nwlb" event={"ID":"d493a63f-9d1a-4995-8698-c3b75c46d69f","Type":"ContainerDied","Data":"4a558ae5d67c86868ad78b63eca19665484b05adc6602c36258c8b0d06d36e66"} Dec 04 10:18:40 crc kubenswrapper[4831]: I1204 10:18:40.889862 4831 generic.go:334] "Generic (PLEG): container finished" podID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" containerID="056a642aad6281e3a4ae8a9fd8c521078f8e1651e9eebfcc44ff896cc5e23e43" exitCode=0 Dec 04 10:18:40 crc kubenswrapper[4831]: I1204 10:18:40.889969 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdsf2" event={"ID":"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1","Type":"ContainerDied","Data":"056a642aad6281e3a4ae8a9fd8c521078f8e1651e9eebfcc44ff896cc5e23e43"} Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.312024 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.494466 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-catalog-content\") pod \"d493a63f-9d1a-4995-8698-c3b75c46d69f\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.494554 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sxrm\" (UniqueName: \"kubernetes.io/projected/d493a63f-9d1a-4995-8698-c3b75c46d69f-kube-api-access-9sxrm\") pod \"d493a63f-9d1a-4995-8698-c3b75c46d69f\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.494583 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-utilities\") pod \"d493a63f-9d1a-4995-8698-c3b75c46d69f\" (UID: \"d493a63f-9d1a-4995-8698-c3b75c46d69f\") " Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.495509 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-utilities" (OuterVolumeSpecName: "utilities") pod "d493a63f-9d1a-4995-8698-c3b75c46d69f" (UID: "d493a63f-9d1a-4995-8698-c3b75c46d69f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.502332 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d493a63f-9d1a-4995-8698-c3b75c46d69f-kube-api-access-9sxrm" (OuterVolumeSpecName: "kube-api-access-9sxrm") pod "d493a63f-9d1a-4995-8698-c3b75c46d69f" (UID: "d493a63f-9d1a-4995-8698-c3b75c46d69f"). InnerVolumeSpecName "kube-api-access-9sxrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.554553 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d493a63f-9d1a-4995-8698-c3b75c46d69f" (UID: "d493a63f-9d1a-4995-8698-c3b75c46d69f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.596150 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.596198 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sxrm\" (UniqueName: \"kubernetes.io/projected/d493a63f-9d1a-4995-8698-c3b75c46d69f-kube-api-access-9sxrm\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.596210 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d493a63f-9d1a-4995-8698-c3b75c46d69f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.687515 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.798074 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-catalog-content\") pod \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\" (UID: \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.798138 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb4fz\" (UniqueName: \"kubernetes.io/projected/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-kube-api-access-tb4fz\") pod \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\" (UID: \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.798159 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-utilities\") pod \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\" (UID: \"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1\") " Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.798990 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-utilities" (OuterVolumeSpecName: "utilities") pod "e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" (UID: "e6ffc610-2a29-4b62-abc6-2cc0568c8bc1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.802366 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-kube-api-access-tb4fz" (OuterVolumeSpecName: "kube-api-access-tb4fz") pod "e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" (UID: "e6ffc610-2a29-4b62-abc6-2cc0568c8bc1"). InnerVolumeSpecName "kube-api-access-tb4fz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.841190 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" (UID: "e6ffc610-2a29-4b62-abc6-2cc0568c8bc1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.897507 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nwlb" event={"ID":"d493a63f-9d1a-4995-8698-c3b75c46d69f","Type":"ContainerDied","Data":"98ed8bafb6fd6deef417dd7fea6fac8fd18d995cf61be09e6819f65959725801"} Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.897770 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nwlb" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.898111 4831 scope.go:117] "RemoveContainer" containerID="4a558ae5d67c86868ad78b63eca19665484b05adc6602c36258c8b0d06d36e66" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.899008 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.899090 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb4fz\" (UniqueName: \"kubernetes.io/projected/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-kube-api-access-tb4fz\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.899165 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1-utilities\") on node \"crc\" 
DevicePath \"\"" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.901546 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cdsf2" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.901707 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdsf2" event={"ID":"e6ffc610-2a29-4b62-abc6-2cc0568c8bc1","Type":"ContainerDied","Data":"8ac161e7372733eb638a32be109e3ec5d29d6d4fcd912058b1f7d75d86fef5fc"} Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.926525 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2nwlb"] Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.934773 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2nwlb"] Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.940089 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cdsf2"] Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.942793 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cdsf2"] Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.958759 4831 scope.go:117] "RemoveContainer" containerID="9fe9dec0237d3c55a331e8c0d1c8a37f50468981792b8c52df95e449ff350f24" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.973156 4831 scope.go:117] "RemoveContainer" containerID="d503e5db61636659044d128fb4d62d8422ad3b5040e1d73af8e853aa5ffda2fc" Dec 04 10:18:41 crc kubenswrapper[4831]: I1204 10:18:41.987846 4831 scope.go:117] "RemoveContainer" containerID="056a642aad6281e3a4ae8a9fd8c521078f8e1651e9eebfcc44ff896cc5e23e43" Dec 04 10:18:42 crc kubenswrapper[4831]: I1204 10:18:42.003406 4831 scope.go:117] "RemoveContainer" containerID="cac3ad17efef243893e9b0aba847f8cd57662629862857e4e66fc5d502535d16" Dec 04 10:18:42 crc kubenswrapper[4831]: I1204 
10:18:42.021999 4831 scope.go:117] "RemoveContainer" containerID="39b244c2954a101a943244d0a98bb0600199e5dfb21b16b42d2856e5a42dd095" Dec 04 10:18:42 crc kubenswrapper[4831]: I1204 10:18:42.711420 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c7k4c"] Dec 04 10:18:42 crc kubenswrapper[4831]: I1204 10:18:42.711732 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c7k4c" podUID="470670f0-d9fa-4aca-8086-a63b711191d1" containerName="registry-server" containerID="cri-o://ffae62b72edfa3152ace2371bb333b17c2024866896e064465545a1df1ac1199" gracePeriod=2 Dec 04 10:18:43 crc kubenswrapper[4831]: I1204 10:18:43.286165 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d493a63f-9d1a-4995-8698-c3b75c46d69f" path="/var/lib/kubelet/pods/d493a63f-9d1a-4995-8698-c3b75c46d69f/volumes" Dec 04 10:18:43 crc kubenswrapper[4831]: I1204 10:18:43.287885 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" path="/var/lib/kubelet/pods/e6ffc610-2a29-4b62-abc6-2cc0568c8bc1/volumes" Dec 04 10:18:43 crc kubenswrapper[4831]: I1204 10:18:43.919067 4831 generic.go:334] "Generic (PLEG): container finished" podID="470670f0-d9fa-4aca-8086-a63b711191d1" containerID="ffae62b72edfa3152ace2371bb333b17c2024866896e064465545a1df1ac1199" exitCode=0 Dec 04 10:18:43 crc kubenswrapper[4831]: I1204 10:18:43.919106 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7k4c" event={"ID":"470670f0-d9fa-4aca-8086-a63b711191d1","Type":"ContainerDied","Data":"ffae62b72edfa3152ace2371bb333b17c2024866896e064465545a1df1ac1199"} Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.549530 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.741225 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-utilities\") pod \"470670f0-d9fa-4aca-8086-a63b711191d1\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.741355 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-catalog-content\") pod \"470670f0-d9fa-4aca-8086-a63b711191d1\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.741494 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2ptk\" (UniqueName: \"kubernetes.io/projected/470670f0-d9fa-4aca-8086-a63b711191d1-kube-api-access-g2ptk\") pod \"470670f0-d9fa-4aca-8086-a63b711191d1\" (UID: \"470670f0-d9fa-4aca-8086-a63b711191d1\") " Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.742357 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-utilities" (OuterVolumeSpecName: "utilities") pod "470670f0-d9fa-4aca-8086-a63b711191d1" (UID: "470670f0-d9fa-4aca-8086-a63b711191d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.746115 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/470670f0-d9fa-4aca-8086-a63b711191d1-kube-api-access-g2ptk" (OuterVolumeSpecName: "kube-api-access-g2ptk") pod "470670f0-d9fa-4aca-8086-a63b711191d1" (UID: "470670f0-d9fa-4aca-8086-a63b711191d1"). InnerVolumeSpecName "kube-api-access-g2ptk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.843230 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2ptk\" (UniqueName: \"kubernetes.io/projected/470670f0-d9fa-4aca-8086-a63b711191d1-kube-api-access-g2ptk\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.843268 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.925529 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7k4c" event={"ID":"470670f0-d9fa-4aca-8086-a63b711191d1","Type":"ContainerDied","Data":"2c4e1513bfb1d8789eff855f374410bd10bbe99eb9dd3a9364990461d369f014"} Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.925584 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c7k4c" Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.925599 4831 scope.go:117] "RemoveContainer" containerID="ffae62b72edfa3152ace2371bb333b17c2024866896e064465545a1df1ac1199" Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.943887 4831 scope.go:117] "RemoveContainer" containerID="567715934c3b8fa65dc09b65835c65eb08ba4c73479c0be2de65cf01291db06f" Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.970840 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "470670f0-d9fa-4aca-8086-a63b711191d1" (UID: "470670f0-d9fa-4aca-8086-a63b711191d1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:18:44 crc kubenswrapper[4831]: I1204 10:18:44.980262 4831 scope.go:117] "RemoveContainer" containerID="4d1beaf5fdb87acf1375d4ec15b2e393aa26e577d1bf40ae0476755eb6374ec7" Dec 04 10:18:45 crc kubenswrapper[4831]: I1204 10:18:45.045439 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470670f0-d9fa-4aca-8086-a63b711191d1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:45 crc kubenswrapper[4831]: I1204 10:18:45.252224 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c7k4c"] Dec 04 10:18:45 crc kubenswrapper[4831]: I1204 10:18:45.259177 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c7k4c"] Dec 04 10:18:45 crc kubenswrapper[4831]: I1204 10:18:45.282519 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="470670f0-d9fa-4aca-8086-a63b711191d1" path="/var/lib/kubelet/pods/470670f0-d9fa-4aca-8086-a63b711191d1/volumes" Dec 04 10:18:51 crc kubenswrapper[4831]: I1204 10:18:51.855834 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-d8prt"] Dec 04 10:18:51 crc kubenswrapper[4831]: I1204 10:18:51.971370 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:18:51 crc kubenswrapper[4831]: I1204 10:18:51.971610 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Dec 04 10:18:51 crc kubenswrapper[4831]: I1204 10:18:51.971719 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:18:51 crc kubenswrapper[4831]: I1204 10:18:51.972303 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5399a139e421589fc50e4bad256402be123e37600f5c39eadf919ce767e9eeeb"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:18:51 crc kubenswrapper[4831]: I1204 10:18:51.972476 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://5399a139e421589fc50e4bad256402be123e37600f5c39eadf919ce767e9eeeb" gracePeriod=600 Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.498989 4831 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.499502 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320" gracePeriod=15 Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.499569 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc" gracePeriod=15 Dec 04 10:18:52 crc 
kubenswrapper[4831]: I1204 10:18:52.499623 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8" gracePeriod=15 Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.499612 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5" gracePeriod=15 Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.499752 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f" gracePeriod=15 Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509193 4831 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509538 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509564 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509579 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509593 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509615 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509628 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509647 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27020e38-b505-48b1-8068-96c14dba1b9d" containerName="pruner" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509683 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="27020e38-b505-48b1-8068-96c14dba1b9d" containerName="pruner" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509700 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509712 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509724 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d493a63f-9d1a-4995-8698-c3b75c46d69f" containerName="extract-utilities" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509736 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d493a63f-9d1a-4995-8698-c3b75c46d69f" containerName="extract-utilities" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509752 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" containerName="extract-content" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509764 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" containerName="extract-content" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509779 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d493a63f-9d1a-4995-8698-c3b75c46d69f" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509791 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d493a63f-9d1a-4995-8698-c3b75c46d69f" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509806 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509818 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509837 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509850 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509864 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470670f0-d9fa-4aca-8086-a63b711191d1" containerName="extract-utilities" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509876 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="470670f0-d9fa-4aca-8086-a63b711191d1" containerName="extract-utilities" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509895 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509907 4831 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509925 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470670f0-d9fa-4aca-8086-a63b711191d1" containerName="extract-content" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509938 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="470670f0-d9fa-4aca-8086-a63b711191d1" containerName="extract-content" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509952 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" containerName="extract-content" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509963 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" containerName="extract-content" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.509980 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" containerName="extract-utilities" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.509992 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" containerName="extract-utilities" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.510010 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" containerName="extract-utilities" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.510021 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" containerName="extract-utilities" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.510040 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470670f0-d9fa-4aca-8086-a63b711191d1" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.510052 4831 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="470670f0-d9fa-4aca-8086-a63b711191d1" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.510069 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.510082 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.510096 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d493a63f-9d1a-4995-8698-c3b75c46d69f" containerName="extract-content" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.510108 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d493a63f-9d1a-4995-8698-c3b75c46d69f" containerName="extract-content" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.510126 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.510138 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.510155 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a67ea8-4808-4214-b17b-4f647b447bf3" containerName="pruner" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.510169 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a67ea8-4808-4214-b17b-4f647b447bf3" containerName="pruner" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.517157 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d493a63f-9d1a-4995-8698-c3b75c46d69f" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.517215 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.517243 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ffc610-2a29-4b62-abc6-2cc0568c8bc1" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.517269 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.517287 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.517303 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="27020e38-b505-48b1-8068-96c14dba1b9d" containerName="pruner" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.517320 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.517379 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="470670f0-d9fa-4aca-8086-a63b711191d1" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.517403 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.517421 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d2b5a4-7636-444a-88c3-38bd88a35f99" containerName="registry-server" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.517447 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="33a67ea8-4808-4214-b17b-4f647b447bf3" containerName="pruner" Dec 04 
10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.518042 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.532445 4831 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.533624 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.536006 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.552789 4831 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.666788 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.666829 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.666855 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.666871 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.666888 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.667294 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.667321 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.667346 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768435 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768563 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768605 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768618 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768562 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768678 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768710 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768728 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768771 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768789 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768822 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768829 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768883 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768941 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.768952 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.859020 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:52 crc kubenswrapper[4831]: W1204 10:18:52.878755 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-f3f00c5e29c3745551562193cb6f1f405918504a216744661058b19272ef081f WatchSource:0}: Error finding container f3f00c5e29c3745551562193cb6f1f405918504a216744661058b19272ef081f: Status 404 returned error can't find the container with id f3f00c5e29c3745551562193cb6f1f405918504a216744661058b19272ef081f Dec 04 10:18:52 crc kubenswrapper[4831]: E1204 10:18:52.883502 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dfbd7b59a47dc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 10:18:52.882429916 +0000 UTC m=+229.831605260,LastTimestamp:2025-12-04 10:18:52.882429916 +0000 UTC m=+229.831605260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.986106 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.988423 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.989339 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc" exitCode=0 Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.989372 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f" exitCode=0 Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.989383 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5" exitCode=0 Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.989392 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8" exitCode=2 Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.989459 4831 scope.go:117] "RemoveContainer" containerID="83f8e6a450164c8334b67a2c61d763577e60ba576660d04d2fdaa44c93c753b8" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.995812 4831 generic.go:334] "Generic (PLEG): container finished" podID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" containerID="73928568938a5f466568a9cc96eeada6f3f43226837fb06565369da84f65483b" exitCode=0 Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.995882 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fe06f3bd-34a4-4f9c-9258-ca780b0a510b","Type":"ContainerDied","Data":"73928568938a5f466568a9cc96eeada6f3f43226837fb06565369da84f65483b"} Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.997025 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:52 crc kubenswrapper[4831]: I1204 10:18:52.997558 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:53 crc kubenswrapper[4831]: I1204 10:18:53.000334 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="5399a139e421589fc50e4bad256402be123e37600f5c39eadf919ce767e9eeeb" exitCode=0 Dec 04 10:18:53 crc kubenswrapper[4831]: I1204 10:18:53.000418 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"5399a139e421589fc50e4bad256402be123e37600f5c39eadf919ce767e9eeeb"} Dec 04 10:18:53 crc kubenswrapper[4831]: I1204 10:18:53.000449 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"6f1a1ccac41bfbdc84a51f15b671049caf3ab62892dd1887a29c7fe8435d66ae"} Dec 04 10:18:53 crc kubenswrapper[4831]: I1204 10:18:53.001442 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:53 crc kubenswrapper[4831]: I1204 10:18:53.001761 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f3f00c5e29c3745551562193cb6f1f405918504a216744661058b19272ef081f"} Dec 04 10:18:53 crc kubenswrapper[4831]: I1204 10:18:53.002133 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:53 crc kubenswrapper[4831]: I1204 
10:18:53.002769 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:53 crc kubenswrapper[4831]: I1204 10:18:53.282588 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:53 crc kubenswrapper[4831]: I1204 10:18:53.283012 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:53 crc kubenswrapper[4831]: I1204 10:18:53.283444 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.010009 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9ce0fce46321ec484701f16167b757cb2270b8a05acbc0a2c516d3e40a7374f7"} Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.010599 4831 status_manager.go:851] "Failed to get status for pod" 
podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:54 crc kubenswrapper[4831]: E1204 10:18:54.010905 4831 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.010944 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.013083 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.274720 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.277710 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.278393 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.387706 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kubelet-dir\") pod \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.387795 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kube-api-access\") pod \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.387862 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-var-lock\") pod \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\" (UID: \"fe06f3bd-34a4-4f9c-9258-ca780b0a510b\") " Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.388472 4831 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fe06f3bd-34a4-4f9c-9258-ca780b0a510b" (UID: "fe06f3bd-34a4-4f9c-9258-ca780b0a510b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.389203 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-var-lock" (OuterVolumeSpecName: "var-lock") pod "fe06f3bd-34a4-4f9c-9258-ca780b0a510b" (UID: "fe06f3bd-34a4-4f9c-9258-ca780b0a510b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.397969 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fe06f3bd-34a4-4f9c-9258-ca780b0a510b" (UID: "fe06f3bd-34a4-4f9c-9258-ca780b0a510b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.489649 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.489736 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:54 crc kubenswrapper[4831]: I1204 10:18:54.489751 4831 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe06f3bd-34a4-4f9c-9258-ca780b0a510b-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.022466 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.024377 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320" exitCode=0 Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.026055 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.026078 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fe06f3bd-34a4-4f9c-9258-ca780b0a510b","Type":"ContainerDied","Data":"92fd6db70bd70204fa4e2acc4c172baefcdc78f0bd0fc9fb96bad3f3fc08a964"} Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.026235 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92fd6db70bd70204fa4e2acc4c172baefcdc78f0bd0fc9fb96bad3f3fc08a964" Dec 04 10:18:55 crc kubenswrapper[4831]: E1204 10:18:55.026945 4831 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.050272 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.050992 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.272167 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 
10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.273361 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.273979 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.274534 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.274833 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.401797 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.401858 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" 
(UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.401887 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.401931 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.401900 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.402089 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.402288 4831 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.402306 4831 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:55 crc kubenswrapper[4831]: I1204 10:18:55.402316 4831 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.035798 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.038080 4831 scope.go:117] "RemoveContainer" containerID="0f02babaadd36618d6c35bd80cf6f69550c33587dc2005d72c013dc3c1eb9edc" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.038177 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.039107 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.039595 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.040224 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.059549 4831 scope.go:117] "RemoveContainer" containerID="e450a2b569444c983951262de966fe4eba71f5b08e5b0e944664b7576cceb15f" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.060786 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.061073 4831 status_manager.go:851] "Failed to get status for pod" 
podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.062043 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.077352 4831 scope.go:117] "RemoveContainer" containerID="d92e2ecc9874ffd995e182da7536d0827f1c07da4cfc8396626bb64d4fed75e5" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.091916 4831 scope.go:117] "RemoveContainer" containerID="8e48331ededcc0c7b1e8035966abd3cd9d9f1b1968ddb644fed1a30156fb75c8" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.102891 4831 scope.go:117] "RemoveContainer" containerID="8f5514618f60986357d3b79933f24dc63345132993350fd477062ec67e2c5320" Dec 04 10:18:56 crc kubenswrapper[4831]: I1204 10:18:56.121642 4831 scope.go:117] "RemoveContainer" containerID="8aa023ec56db40530ba6cf6c9d5f98e311683165cc8bc397e6f8de7442cff43a" Dec 04 10:18:57 crc kubenswrapper[4831]: I1204 10:18:57.286908 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 04 10:18:57 crc kubenswrapper[4831]: E1204 10:18:57.294278 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dfbd7b59a47dc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 10:18:52.882429916 +0000 UTC m=+229.831605260,LastTimestamp:2025-12-04 10:18:52.882429916 +0000 UTC m=+229.831605260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 10:18:57 crc kubenswrapper[4831]: E1204 10:18:57.553619 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:57 crc kubenswrapper[4831]: E1204 10:18:57.554054 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:57 crc kubenswrapper[4831]: E1204 10:18:57.554747 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:57 crc kubenswrapper[4831]: E1204 10:18:57.555025 4831 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:57 crc kubenswrapper[4831]: E1204 10:18:57.555292 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:18:57 crc kubenswrapper[4831]: I1204 10:18:57.555323 4831 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 04 10:18:57 crc kubenswrapper[4831]: E1204 10:18:57.555604 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="200ms" Dec 04 10:18:57 crc kubenswrapper[4831]: E1204 10:18:57.756061 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="400ms" Dec 04 10:18:58 crc kubenswrapper[4831]: E1204 10:18:58.157499 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="800ms" Dec 04 10:18:58 crc kubenswrapper[4831]: E1204 10:18:58.958473 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="1.6s" Dec 04 
10:19:00 crc kubenswrapper[4831]: E1204 10:19:00.559208 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="3.2s" Dec 04 10:19:03 crc kubenswrapper[4831]: I1204 10:19:03.281897 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:03 crc kubenswrapper[4831]: I1204 10:19:03.282817 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:03 crc kubenswrapper[4831]: E1204 10:19:03.761477 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="6.4s" Dec 04 10:19:06 crc kubenswrapper[4831]: I1204 10:19:06.734255 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 04 10:19:06 crc kubenswrapper[4831]: I1204 10:19:06.735068 4831 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.110150 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.110226 4831 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4" exitCode=1 Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.110270 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4"} Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.110915 4831 scope.go:117] "RemoveContainer" containerID="61e3c29b09715bd660a94365179740c46dd6bbeb04b333ecb5dadbf8cdea5cf4" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.112497 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.113243 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.114848 4831 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.278945 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.279879 4831 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.280316 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.280510 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: 
connect: connection refused" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.292494 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.292549 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:07 crc kubenswrapper[4831]: E1204 10:19:07.293191 4831 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:19:07 crc kubenswrapper[4831]: I1204 10:19:07.293548 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:19:07 crc kubenswrapper[4831]: E1204 10:19:07.294862 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dfbd7b59a47dc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 10:18:52.882429916 +0000 UTC m=+229.831605260,LastTimestamp:2025-12-04 
10:18:52.882429916 +0000 UTC m=+229.831605260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 10:19:07 crc kubenswrapper[4831]: W1204 10:19:07.320187 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-278499df279272a52e4e5e2675ead9b90900c5e9176420af925a68b8d530913b WatchSource:0}: Error finding container 278499df279272a52e4e5e2675ead9b90900c5e9176420af925a68b8d530913b: Status 404 returned error can't find the container with id 278499df279272a52e4e5e2675ead9b90900c5e9176420af925a68b8d530913b Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.119504 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.119922 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5348dd1ad1a3d1b11c7aecb7f65b37047420c85d6c48db265cb22ed98fdfd5e9"} Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.121018 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.121415 4831 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="da816a037152837e3bfacb7fccbc8bc5fc44570592538aa05b9f17575ab443ca" exitCode=0 Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 
10:19:08.121407 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.121433 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"da816a037152837e3bfacb7fccbc8bc5fc44570592538aa05b9f17575ab443ca"} Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.121529 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"278499df279272a52e4e5e2675ead9b90900c5e9176420af925a68b8d530913b"} Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.121758 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.121798 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.121802 4831 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:08 crc kubenswrapper[4831]: E1204 10:19:08.122095 4831 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.122123 4831 status_manager.go:851] "Failed to get status for pod" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.122362 4831 status_manager.go:851] "Failed to get status for pod" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-g76nn\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.122629 4831 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Dec 04 10:19:08 crc kubenswrapper[4831]: I1204 10:19:08.811652 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:19:09 crc kubenswrapper[4831]: I1204 10:19:09.134280 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6c2a3d291ace0fb936d04f08365014fe0c440790efed07849bc36c1d89734386"} Dec 04 10:19:09 crc 
kubenswrapper[4831]: I1204 10:19:09.134327 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9b4de6563be327a7113a8f135e839b9dc7414339916f167e0f85120975f213be"} Dec 04 10:19:10 crc kubenswrapper[4831]: I1204 10:19:10.160233 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c2e024bc355a87c4bc63668ee1fcf5021a8a74217499da5c7126c74253ed5d6d"} Dec 04 10:19:10 crc kubenswrapper[4831]: I1204 10:19:10.160546 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"24648c7c6d897056451c4a5d09a3ae20c00f96b115a6d36f7ea59f119776cad9"} Dec 04 10:19:10 crc kubenswrapper[4831]: I1204 10:19:10.160557 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"93c923f7115897c697d08b17ce91bad1e747e136394b4d6ab7b1d8a31569c49f"} Dec 04 10:19:10 crc kubenswrapper[4831]: I1204 10:19:10.160823 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:10 crc kubenswrapper[4831]: I1204 10:19:10.160836 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:10 crc kubenswrapper[4831]: I1204 10:19:10.161032 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:19:12 crc kubenswrapper[4831]: I1204 10:19:12.294568 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:19:12 crc kubenswrapper[4831]: I1204 10:19:12.295036 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:19:12 crc kubenswrapper[4831]: I1204 10:19:12.302348 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:19:13 crc kubenswrapper[4831]: I1204 10:19:13.035437 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:19:13 crc kubenswrapper[4831]: I1204 10:19:13.035835 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 04 10:19:13 crc kubenswrapper[4831]: I1204 10:19:13.035937 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 04 10:19:15 crc kubenswrapper[4831]: I1204 10:19:15.172408 4831 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:19:15 crc kubenswrapper[4831]: I1204 10:19:15.244277 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="abc48b37-25af-43fd-aadf-a78931b0ab1a" Dec 04 10:19:16 crc kubenswrapper[4831]: I1204 10:19:16.189636 4831 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:16 crc kubenswrapper[4831]: I1204 10:19:16.189908 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:16 crc kubenswrapper[4831]: I1204 10:19:16.192082 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="abc48b37-25af-43fd-aadf-a78931b0ab1a" Dec 04 10:19:16 crc kubenswrapper[4831]: I1204 10:19:16.194049 4831 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://9b4de6563be327a7113a8f135e839b9dc7414339916f167e0f85120975f213be" Dec 04 10:19:16 crc kubenswrapper[4831]: I1204 10:19:16.194072 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:19:16 crc kubenswrapper[4831]: I1204 10:19:16.881708 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" podUID="7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" containerName="oauth-openshift" containerID="cri-o://bb21fdf73e12ca17a408e36b8b2948b4de195f855aecec64550b9723bb582263" gracePeriod=15 Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.198873 4831 generic.go:334] "Generic (PLEG): container finished" podID="7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" containerID="bb21fdf73e12ca17a408e36b8b2948b4de195f855aecec64550b9723bb582263" exitCode=0 Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.199429 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.199443 4831 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.199053 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" event={"ID":"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3","Type":"ContainerDied","Data":"bb21fdf73e12ca17a408e36b8b2948b4de195f855aecec64550b9723bb582263"} Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.211631 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="abc48b37-25af-43fd-aadf-a78931b0ab1a" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.357650 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.517573 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-router-certs\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.517644 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-cliconfig\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.517720 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-ocp-branding-template\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.517759 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-session\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.517829 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-login\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.517870 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-idp-0-file-data\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.517908 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-serving-cert\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.517937 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-dir\") pod 
\"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.517970 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-service-ca\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.518027 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzs4p\" (UniqueName: \"kubernetes.io/projected/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-kube-api-access-gzs4p\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.518068 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-policies\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.518100 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-trusted-ca-bundle\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.518144 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-error\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc 
kubenswrapper[4831]: I1204 10:19:17.518178 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-provider-selection\") pod \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\" (UID: \"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3\") " Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.518703 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.519515 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.519533 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.520169 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.520380 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.524909 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.525410 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-kube-api-access-gzs4p" (OuterVolumeSpecName: "kube-api-access-gzs4p") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "kube-api-access-gzs4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.525495 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.525702 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.531526 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.531689 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.532351 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.532912 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.533108 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" (UID: "7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619417 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619462 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619474 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619484 4831 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619496 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619508 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzs4p\" (UniqueName: \"kubernetes.io/projected/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-kube-api-access-gzs4p\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619518 4831 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619526 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619537 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619547 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619556 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619565 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc kubenswrapper[4831]: I1204 10:19:17.619574 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:17 crc 
kubenswrapper[4831]: I1204 10:19:17.619582 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:18 crc kubenswrapper[4831]: I1204 10:19:18.205711 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" event={"ID":"7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3","Type":"ContainerDied","Data":"0904d2e17a1c0aa53a8c8f48f89bf14d1fbb84e936d2676480dca09a225514d0"} Dec 04 10:19:18 crc kubenswrapper[4831]: I1204 10:19:18.205769 4831 scope.go:117] "RemoveContainer" containerID="bb21fdf73e12ca17a408e36b8b2948b4de195f855aecec64550b9723bb582263" Dec 04 10:19:18 crc kubenswrapper[4831]: I1204 10:19:18.205806 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-d8prt" Dec 04 10:19:23 crc kubenswrapper[4831]: I1204 10:19:23.042579 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:19:23 crc kubenswrapper[4831]: I1204 10:19:23.048250 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 10:19:24 crc kubenswrapper[4831]: I1204 10:19:24.333894 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 10:19:25 crc kubenswrapper[4831]: I1204 10:19:25.533984 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 10:19:25 crc kubenswrapper[4831]: I1204 10:19:25.611468 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 10:19:25 crc kubenswrapper[4831]: I1204 10:19:25.687796 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 10:19:26 crc kubenswrapper[4831]: I1204 10:19:26.006034 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 10:19:26 crc kubenswrapper[4831]: I1204 10:19:26.512504 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 04 10:19:26 crc kubenswrapper[4831]: I1204 10:19:26.522502 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 10:19:26 crc kubenswrapper[4831]: I1204 10:19:26.762531 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 10:19:26 crc kubenswrapper[4831]: I1204 10:19:26.820825 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 10:19:26 crc kubenswrapper[4831]: I1204 10:19:26.826366 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 04 10:19:27 crc kubenswrapper[4831]: I1204 10:19:27.105784 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 10:19:27 crc kubenswrapper[4831]: I1204 10:19:27.140373 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 04 10:19:27 crc kubenswrapper[4831]: I1204 10:19:27.203095 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 10:19:27 crc kubenswrapper[4831]: I1204 10:19:27.457033 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 10:19:27 crc kubenswrapper[4831]: I1204 10:19:27.606749 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 10:19:27 crc kubenswrapper[4831]: I1204 10:19:27.821534 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 10:19:27 crc kubenswrapper[4831]: I1204 10:19:27.869572 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 10:19:27 crc kubenswrapper[4831]: I1204 10:19:27.880858 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 04 10:19:27 crc kubenswrapper[4831]: I1204 10:19:27.953828 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 10:19:27 crc kubenswrapper[4831]: I1204 10:19:27.986410 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.052643 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.090903 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.213400 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.225890 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.246235 4831 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.280726 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.336773 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.348060 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.419713 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.464717 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.794475 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.821587 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.850642 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.886763 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.920439 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.980979 4831 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 10:19:28 crc kubenswrapper[4831]: I1204 10:19:28.998307 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.052457 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.096961 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.135891 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.143222 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.155478 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.183572 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.234305 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.235058 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.293824 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 
10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.360827 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.470799 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.501635 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.528377 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.553943 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.663251 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.664536 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.667244 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.831629 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.848063 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.943940 4831 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.944701 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 10:19:29 crc kubenswrapper[4831]: I1204 10:19:29.982115 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.159859 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.168701 4831 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.235089 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.336715 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.369619 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.465696 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.473622 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.492043 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 10:19:30 
crc kubenswrapper[4831]: I1204 10:19:30.520986 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.660469 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.736347 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.745459 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.749149 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.824133 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.892574 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 10:19:30 crc kubenswrapper[4831]: I1204 10:19:30.990423 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.170164 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.197106 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.380899 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.465695 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.469628 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.490705 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.492019 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.504229 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.505747 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.512975 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.513332 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.532322 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.576378 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 
10:19:31.627345 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.645038 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.678290 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.686227 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.755025 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.790118 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.806731 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.884268 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 10:19:31 crc kubenswrapper[4831]: I1204 10:19:31.937299 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.074035 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.079170 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.136261 4831 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.277475 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.309687 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.365897 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.386867 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.395357 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.485466 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.592860 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.684377 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.737630 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.747953 4831 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.832416 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.879454 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.910849 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.953458 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.974739 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 10:19:32 crc kubenswrapper[4831]: I1204 10:19:32.991842 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.076811 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.087975 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.188925 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.324189 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 04 
10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.439023 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.482609 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.495735 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.513420 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.546164 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.558693 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.647589 4831 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.790918 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.827736 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.889593 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.906001 4831 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-service-ca"/"signing-key" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.944278 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.961820 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.982999 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 10:19:33 crc kubenswrapper[4831]: I1204 10:19:33.994521 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.175032 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.250899 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.298118 4831 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.387209 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.437619 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.448014 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.448979 4831 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.465756 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.481242 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.530603 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.613539 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.615130 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.717708 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.767734 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.830984 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.832446 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.902452 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 10:19:34 crc kubenswrapper[4831]: I1204 10:19:34.954809 4831 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.017724 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.091286 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.130812 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.179405 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.286450 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.311781 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.328943 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.354894 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.356514 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.409501 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.492597 
4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.557576 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.561026 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.714635 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 04 10:19:35 crc kubenswrapper[4831]: I1204 10:19:35.969144 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.048065 4831 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.203542 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.220288 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.263644 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.272945 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.351269 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.405753 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.567448 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.582062 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.652356 4831 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.701259 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.841611 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.934429 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.954457 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.987889 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 10:19:36 crc kubenswrapper[4831]: I1204 10:19:36.997846 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.015768 
4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.180800 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.221242 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.223016 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.252017 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.406551 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.510472 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.686768 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.688336 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.703095 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.948750 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 10:19:37 crc 
kubenswrapper[4831]: I1204 10:19:37.954649 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.966206 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 10:19:37 crc kubenswrapper[4831]: I1204 10:19:37.967235 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.009149 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.080979 4831 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.085229 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-d8prt"] Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.085292 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-759d657776-4w4bv","openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 10:19:38 crc kubenswrapper[4831]: E1204 10:19:38.085481 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" containerName="installer" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.085498 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" containerName="installer" Dec 04 10:19:38 crc kubenswrapper[4831]: E1204 10:19:38.085522 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" containerName="oauth-openshift" Dec 04 
10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.085531 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" containerName="oauth-openshift" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.085636 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" containerName="oauth-openshift" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.085657 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe06f3bd-34a4-4f9c-9258-ca780b0a510b" containerName="installer" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.085688 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.085709 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7266f967-3803-4ef3-9609-5a9c540a8305" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.085999 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.088422 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.089375 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.089684 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.089914 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.091342 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.092201 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.092899 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.092931 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.092902 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.094763 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 10:19:38 crc 
kubenswrapper[4831]: I1204 10:19:38.095708 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.096177 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.101144 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.101566 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.112560 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.117008 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.119976 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.132915 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.132895093 podStartE2EDuration="23.132895093s" podCreationTimestamp="2025-12-04 10:19:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:19:38.129106476 +0000 UTC m=+275.078281790" watchObservedRunningTime="2025-12-04 10:19:38.132895093 +0000 UTC m=+275.082070417" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.146947 4831 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.196644 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.196736 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-template-error\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.196768 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.196790 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-router-certs\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc 
kubenswrapper[4831]: I1204 10:19:38.196818 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-template-login\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.196876 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.196936 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.196974 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-audit-policies\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.196997 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-session\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.197021 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.197134 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.197208 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz85t\" (UniqueName: \"kubernetes.io/projected/197f9172-2f0e-4100-a5d4-97af0f22ecf0-kube-api-access-zz85t\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.197269 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/197f9172-2f0e-4100-a5d4-97af0f22ecf0-audit-dir\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: 
\"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.197296 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-service-ca\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.249979 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.259820 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.299272 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.299389 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-audit-policies\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.299737 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-session\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.299785 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.299833 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.299877 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz85t\" (UniqueName: \"kubernetes.io/projected/197f9172-2f0e-4100-a5d4-97af0f22ecf0-kube-api-access-zz85t\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.299918 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/197f9172-2f0e-4100-a5d4-97af0f22ecf0-audit-dir\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc 
kubenswrapper[4831]: I1204 10:19:38.299953 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-service-ca\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.300025 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-template-error\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.300068 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.300105 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.300137 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-router-certs\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.300170 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-template-login\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.300214 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.300565 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-service-ca\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.300020 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/197f9172-2f0e-4100-a5d4-97af0f22ecf0-audit-dir\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc 
kubenswrapper[4831]: I1204 10:19:38.301274 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-audit-policies\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.301868 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.302583 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.306077 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.306555 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-template-error\") pod 
\"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.306760 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-template-login\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.307539 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-router-certs\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.307634 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.308465 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-session\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.310343 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.314427 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/197f9172-2f0e-4100-a5d4-97af0f22ecf0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.338036 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz85t\" (UniqueName: \"kubernetes.io/projected/197f9172-2f0e-4100-a5d4-97af0f22ecf0-kube-api-access-zz85t\") pod \"oauth-openshift-759d657776-4w4bv\" (UID: \"197f9172-2f0e-4100-a5d4-97af0f22ecf0\") " pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.407081 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.433224 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.438816 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.448728 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.487428 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.640372 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-759d657776-4w4bv"] Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.642790 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.775263 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 10:19:38 crc kubenswrapper[4831]: I1204 10:19:38.795872 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.048363 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.291272 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3" 
path="/var/lib/kubelet/pods/7fa1b6b7-1b8d-4c8c-96ac-9ed7c2297cc3/volumes" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.327606 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" event={"ID":"197f9172-2f0e-4100-a5d4-97af0f22ecf0","Type":"ContainerStarted","Data":"dd9cc81b79fb8e7be210371959933f0bba89dc2cfa0ae24f59ddf52b69b55ff5"} Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.327869 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.327912 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" event={"ID":"197f9172-2f0e-4100-a5d4-97af0f22ecf0","Type":"ContainerStarted","Data":"29fa7c3c3a91a25835855a9b1471c8060540341247ac9ebc0fe94e44839e10b0"} Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.350646 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.361772 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" podStartSLOduration=48.361756854 podStartE2EDuration="48.361756854s" podCreationTimestamp="2025-12-04 10:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:19:39.361475506 +0000 UTC m=+276.310650830" watchObservedRunningTime="2025-12-04 10:19:39.361756854 +0000 UTC m=+276.310932168" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.434239 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.465809 4831 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.469026 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.496617 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.550618 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-759d657776-4w4bv" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.753996 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.810786 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.880369 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.888837 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.890583 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 10:19:39 crc kubenswrapper[4831]: I1204 10:19:39.961983 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 10:19:40 crc kubenswrapper[4831]: I1204 10:19:40.057785 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 10:19:40 crc kubenswrapper[4831]: I1204 10:19:40.084379 
4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 04 10:19:40 crc kubenswrapper[4831]: I1204 10:19:40.114334 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 10:19:40 crc kubenswrapper[4831]: I1204 10:19:40.288381 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 10:19:40 crc kubenswrapper[4831]: I1204 10:19:40.341973 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 10:19:40 crc kubenswrapper[4831]: I1204 10:19:40.678357 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 04 10:19:40 crc kubenswrapper[4831]: I1204 10:19:40.687306 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 10:19:40 crc kubenswrapper[4831]: I1204 10:19:40.760974 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 10:19:49 crc kubenswrapper[4831]: I1204 10:19:49.130460 4831 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 10:19:49 crc kubenswrapper[4831]: I1204 10:19:49.131404 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://9ce0fce46321ec484701f16167b757cb2270b8a05acbc0a2c516d3e40a7374f7" gracePeriod=5 Dec 04 10:19:52 crc kubenswrapper[4831]: I1204 10:19:52.349241 4831 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.149880 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6xp2"] Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.151409 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k6xp2" podUID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" containerName="registry-server" containerID="cri-o://398a5435e0416adc6d4938d7ad0d91b9ca52042c42d9094b63038b7167258736" gracePeriod=30 Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.169940 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nbm8n"] Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.170433 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nbm8n" podUID="16d53c94-9ca3-4b5a-b33d-829f44de5367" containerName="registry-server" containerID="cri-o://771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4" gracePeriod=30 Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.205148 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qp6r2"] Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.205414 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" podUID="73f9aaec-7f63-4909-9ffe-6b073e0225d9" containerName="marketplace-operator" containerID="cri-o://26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb" gracePeriod=30 Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.221847 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgjhm"] Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.222252 
4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xgjhm" podUID="0164870d-aba7-43b6-b798-93ec48968837" containerName="registry-server" containerID="cri-o://d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204" gracePeriod=30 Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.229550 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bbqht"] Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.230057 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bbqht" podUID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" containerName="registry-server" containerID="cri-o://fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4" gracePeriod=30 Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.233771 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jpjfl"] Dec 04 10:19:54 crc kubenswrapper[4831]: E1204 10:19:54.234108 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.234129 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.234269 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.234807 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.236373 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jpjfl"] Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.312637 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/398ad5ca-6c9b-4503-a64e-c31a5e34205a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jpjfl\" (UID: \"398ad5ca-6c9b-4503-a64e-c31a5e34205a\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.312719 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz2mx\" (UniqueName: \"kubernetes.io/projected/398ad5ca-6c9b-4503-a64e-c31a5e34205a-kube-api-access-rz2mx\") pod \"marketplace-operator-79b997595-jpjfl\" (UID: \"398ad5ca-6c9b-4503-a64e-c31a5e34205a\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.312773 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/398ad5ca-6c9b-4503-a64e-c31a5e34205a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jpjfl\" (UID: \"398ad5ca-6c9b-4503-a64e-c31a5e34205a\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.413340 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/398ad5ca-6c9b-4503-a64e-c31a5e34205a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jpjfl\" (UID: 
\"398ad5ca-6c9b-4503-a64e-c31a5e34205a\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.413387 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz2mx\" (UniqueName: \"kubernetes.io/projected/398ad5ca-6c9b-4503-a64e-c31a5e34205a-kube-api-access-rz2mx\") pod \"marketplace-operator-79b997595-jpjfl\" (UID: \"398ad5ca-6c9b-4503-a64e-c31a5e34205a\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.413443 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/398ad5ca-6c9b-4503-a64e-c31a5e34205a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jpjfl\" (UID: \"398ad5ca-6c9b-4503-a64e-c31a5e34205a\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.415527 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/398ad5ca-6c9b-4503-a64e-c31a5e34205a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jpjfl\" (UID: \"398ad5ca-6c9b-4503-a64e-c31a5e34205a\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.419525 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/398ad5ca-6c9b-4503-a64e-c31a5e34205a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jpjfl\" (UID: \"398ad5ca-6c9b-4503-a64e-c31a5e34205a\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.419810 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.419876 4831 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="9ce0fce46321ec484701f16167b757cb2270b8a05acbc0a2c516d3e40a7374f7" exitCode=137 Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.423024 4831 generic.go:334] "Generic (PLEG): container finished" podID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" containerID="398a5435e0416adc6d4938d7ad0d91b9ca52042c42d9094b63038b7167258736" exitCode=0 Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.423077 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6xp2" event={"ID":"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997","Type":"ContainerDied","Data":"398a5435e0416adc6d4938d7ad0d91b9ca52042c42d9094b63038b7167258736"} Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.430105 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz2mx\" (UniqueName: \"kubernetes.io/projected/398ad5ca-6c9b-4503-a64e-c31a5e34205a-kube-api-access-rz2mx\") pod \"marketplace-operator-79b997595-jpjfl\" (UID: \"398ad5ca-6c9b-4503-a64e-c31a5e34205a\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.485953 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.486167 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.606710 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.615343 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.615455 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.615514 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.615612 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.615642 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.615718 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.615731 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.615866 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.615945 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.616241 4831 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.616261 4831 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.616271 4831 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.616281 4831 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.623402 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.717145 4831 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.849238 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jpjfl"] Dec 04 10:19:54 crc kubenswrapper[4831]: W1204 10:19:54.860114 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod398ad5ca_6c9b_4503_a64e_c31a5e34205a.slice/crio-9ec63d0b0515521e5a0ce5bb0a2afca91c01aafeffb7f819fb57e0e023a20665 WatchSource:0}: Error finding container 9ec63d0b0515521e5a0ce5bb0a2afca91c01aafeffb7f819fb57e0e023a20665: Status 404 returned error can't find the container with id 9ec63d0b0515521e5a0ce5bb0a2afca91c01aafeffb7f819fb57e0e023a20665 Dec 04 10:19:54 crc kubenswrapper[4831]: I1204 10:19:54.989646 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.066589 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.088993 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.120475 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-catalog-content\") pod \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.120530 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc589\" (UniqueName: \"kubernetes.io/projected/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-kube-api-access-xc589\") pod \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.120634 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-utilities\") pod \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\" (UID: \"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.121627 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-utilities" (OuterVolumeSpecName: "utilities") pod "cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" (UID: "cdf24e44-d00c-424c-aa3c-3e7fdb1a2997"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.121814 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.127967 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-kube-api-access-xc589" (OuterVolumeSpecName: "kube-api-access-xc589") pod "cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" (UID: "cdf24e44-d00c-424c-aa3c-3e7fdb1a2997"). InnerVolumeSpecName "kube-api-access-xc589". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.139303 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.171748 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" (UID: "cdf24e44-d00c-424c-aa3c-3e7fdb1a2997"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.221718 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-operator-metrics\") pod \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.221814 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-catalog-content\") pod \"16d53c94-9ca3-4b5a-b33d-829f44de5367\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.221851 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mlp2\" (UniqueName: \"kubernetes.io/projected/16d53c94-9ca3-4b5a-b33d-829f44de5367-kube-api-access-7mlp2\") pod \"16d53c94-9ca3-4b5a-b33d-829f44de5367\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.221886 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-trusted-ca\") pod \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.222002 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-utilities\") pod \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.222056 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-utilities\") pod \"16d53c94-9ca3-4b5a-b33d-829f44de5367\" (UID: \"16d53c94-9ca3-4b5a-b33d-829f44de5367\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.222109 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbjdt\" (UniqueName: \"kubernetes.io/projected/73f9aaec-7f63-4909-9ffe-6b073e0225d9-kube-api-access-hbjdt\") pod \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\" (UID: \"73f9aaec-7f63-4909-9ffe-6b073e0225d9\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.222322 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.222348 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.222361 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc589\" (UniqueName: \"kubernetes.io/projected/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997-kube-api-access-xc589\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.223073 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "73f9aaec-7f63-4909-9ffe-6b073e0225d9" (UID: "73f9aaec-7f63-4909-9ffe-6b073e0225d9"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.223130 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-utilities" (OuterVolumeSpecName: "utilities") pod "069ef39a-dfc9-4a4c-acf4-758475a8a7b0" (UID: "069ef39a-dfc9-4a4c-acf4-758475a8a7b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.223813 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-utilities" (OuterVolumeSpecName: "utilities") pod "16d53c94-9ca3-4b5a-b33d-829f44de5367" (UID: "16d53c94-9ca3-4b5a-b33d-829f44de5367"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.224881 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "73f9aaec-7f63-4909-9ffe-6b073e0225d9" (UID: "73f9aaec-7f63-4909-9ffe-6b073e0225d9"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.225880 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d53c94-9ca3-4b5a-b33d-829f44de5367-kube-api-access-7mlp2" (OuterVolumeSpecName: "kube-api-access-7mlp2") pod "16d53c94-9ca3-4b5a-b33d-829f44de5367" (UID: "16d53c94-9ca3-4b5a-b33d-829f44de5367"). InnerVolumeSpecName "kube-api-access-7mlp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.226377 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f9aaec-7f63-4909-9ffe-6b073e0225d9-kube-api-access-hbjdt" (OuterVolumeSpecName: "kube-api-access-hbjdt") pod "73f9aaec-7f63-4909-9ffe-6b073e0225d9" (UID: "73f9aaec-7f63-4909-9ffe-6b073e0225d9"). InnerVolumeSpecName "kube-api-access-hbjdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.274624 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16d53c94-9ca3-4b5a-b33d-829f44de5367" (UID: "16d53c94-9ca3-4b5a-b33d-829f44de5367"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.283863 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.322736 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc9hn\" (UniqueName: \"kubernetes.io/projected/0164870d-aba7-43b6-b798-93ec48968837-kube-api-access-sc9hn\") pod \"0164870d-aba7-43b6-b798-93ec48968837\" (UID: \"0164870d-aba7-43b6-b798-93ec48968837\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.322778 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-catalog-content\") pod \"0164870d-aba7-43b6-b798-93ec48968837\" (UID: \"0164870d-aba7-43b6-b798-93ec48968837\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.322816 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndrmg\" (UniqueName: \"kubernetes.io/projected/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-kube-api-access-ndrmg\") pod \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.322867 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-utilities\") pod \"0164870d-aba7-43b6-b798-93ec48968837\" (UID: \"0164870d-aba7-43b6-b798-93ec48968837\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.322886 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-catalog-content\") pod \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\" (UID: \"069ef39a-dfc9-4a4c-acf4-758475a8a7b0\") " Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.323011 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.323021 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mlp2\" (UniqueName: \"kubernetes.io/projected/16d53c94-9ca3-4b5a-b33d-829f44de5367-kube-api-access-7mlp2\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.323031 4831 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.323039 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.323047 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d53c94-9ca3-4b5a-b33d-829f44de5367-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.323055 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbjdt\" (UniqueName: \"kubernetes.io/projected/73f9aaec-7f63-4909-9ffe-6b073e0225d9-kube-api-access-hbjdt\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.323062 4831 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73f9aaec-7f63-4909-9ffe-6b073e0225d9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.323763 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-utilities" (OuterVolumeSpecName: "utilities") pod "0164870d-aba7-43b6-b798-93ec48968837" (UID: "0164870d-aba7-43b6-b798-93ec48968837"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.326921 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0164870d-aba7-43b6-b798-93ec48968837-kube-api-access-sc9hn" (OuterVolumeSpecName: "kube-api-access-sc9hn") pod "0164870d-aba7-43b6-b798-93ec48968837" (UID: "0164870d-aba7-43b6-b798-93ec48968837"). InnerVolumeSpecName "kube-api-access-sc9hn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.339536 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0164870d-aba7-43b6-b798-93ec48968837" (UID: "0164870d-aba7-43b6-b798-93ec48968837"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.347908 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-kube-api-access-ndrmg" (OuterVolumeSpecName: "kube-api-access-ndrmg") pod "069ef39a-dfc9-4a4c-acf4-758475a8a7b0" (UID: "069ef39a-dfc9-4a4c-acf4-758475a8a7b0"). InnerVolumeSpecName "kube-api-access-ndrmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.423925 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.423959 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc9hn\" (UniqueName: \"kubernetes.io/projected/0164870d-aba7-43b6-b798-93ec48968837-kube-api-access-sc9hn\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.423973 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0164870d-aba7-43b6-b798-93ec48968837-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.423982 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndrmg\" (UniqueName: 
\"kubernetes.io/projected/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-kube-api-access-ndrmg\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.428566 4831 generic.go:334] "Generic (PLEG): container finished" podID="16d53c94-9ca3-4b5a-b33d-829f44de5367" containerID="771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4" exitCode=0 Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.428650 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbm8n" event={"ID":"16d53c94-9ca3-4b5a-b33d-829f44de5367","Type":"ContainerDied","Data":"771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4"} Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.428731 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbm8n" event={"ID":"16d53c94-9ca3-4b5a-b33d-829f44de5367","Type":"ContainerDied","Data":"f29c3128e2572671a588f18c9de98595288b3b9ba61f830595e990fb06bf345c"} Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.428735 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nbm8n" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.428755 4831 scope.go:117] "RemoveContainer" containerID="771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.431838 4831 generic.go:334] "Generic (PLEG): container finished" podID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" containerID="fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4" exitCode=0 Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.431973 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqht" event={"ID":"069ef39a-dfc9-4a4c-acf4-758475a8a7b0","Type":"ContainerDied","Data":"fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4"} Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.432016 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbqht" event={"ID":"069ef39a-dfc9-4a4c-acf4-758475a8a7b0","Type":"ContainerDied","Data":"7e7f82c3d9f35b88b7583ada3d87c557c2b15de840cfc783e45152c8b8e4c1fe"} Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.432037 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bbqht" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.433356 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" event={"ID":"398ad5ca-6c9b-4503-a64e-c31a5e34205a","Type":"ContainerStarted","Data":"60e4cda9ddc3e9a7849f9c3481d10b4e36dede9fc104effb1dc9305c502eec83"} Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.433383 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" event={"ID":"398ad5ca-6c9b-4503-a64e-c31a5e34205a","Type":"ContainerStarted","Data":"9ec63d0b0515521e5a0ce5bb0a2afca91c01aafeffb7f819fb57e0e023a20665"} Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.433579 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.435134 4831 generic.go:334] "Generic (PLEG): container finished" podID="0164870d-aba7-43b6-b798-93ec48968837" containerID="d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204" exitCode=0 Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.435196 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgjhm" event={"ID":"0164870d-aba7-43b6-b798-93ec48968837","Type":"ContainerDied","Data":"d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204"} Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.435369 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgjhm" event={"ID":"0164870d-aba7-43b6-b798-93ec48968837","Type":"ContainerDied","Data":"3600c2a4350f451bd62d5c3fce4de9543305f2144642dd5effa9f8d1d91128ee"} Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.435209 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgjhm" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.437558 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "069ef39a-dfc9-4a4c-acf4-758475a8a7b0" (UID: "069ef39a-dfc9-4a4c-acf4-758475a8a7b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.437763 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6xp2" event={"ID":"cdf24e44-d00c-424c-aa3c-3e7fdb1a2997","Type":"ContainerDied","Data":"c11fdb5031e0611630e2779ccd2df35caf362f5c79525779de53c51647211fd1"} Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.437794 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6xp2" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.438854 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.439295 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.439392 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.442455 4831 generic.go:334] "Generic (PLEG): container finished" podID="73f9aaec-7f63-4909-9ffe-6b073e0225d9" containerID="26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb" exitCode=0 Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.442490 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" event={"ID":"73f9aaec-7f63-4909-9ffe-6b073e0225d9","Type":"ContainerDied","Data":"26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb"} Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.442496 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.442511 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qp6r2" event={"ID":"73f9aaec-7f63-4909-9ffe-6b073e0225d9","Type":"ContainerDied","Data":"89985e9b156ba7785d44196e744bc00ab4e4ca268a9a5c073e98447b1fc519aa"} Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.453421 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jpjfl" podStartSLOduration=1.453405285 podStartE2EDuration="1.453405285s" podCreationTimestamp="2025-12-04 10:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:19:55.452012445 +0000 UTC m=+292.401187759" watchObservedRunningTime="2025-12-04 10:19:55.453405285 +0000 UTC m=+292.402580609" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.456895 4831 scope.go:117] "RemoveContainer" containerID="e429cd7acf7a841733146b57baf4d335bd58700bbff2acd50cd7c5e20d635e6c" Dec 04 
10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.464970 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nbm8n"] Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.469931 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nbm8n"] Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.480530 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgjhm"] Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.483892 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgjhm"] Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.490243 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6xp2"] Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.492418 4831 scope.go:117] "RemoveContainer" containerID="1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.495843 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k6xp2"] Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.514412 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qp6r2"] Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.517716 4831 scope.go:117] "RemoveContainer" containerID="771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.519274 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qp6r2"] Dec 04 10:19:55 crc kubenswrapper[4831]: E1204 10:19:55.519330 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4\": container with ID starting with 771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4 not found: ID does not exist" containerID="771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.519375 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4"} err="failed to get container status \"771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4\": rpc error: code = NotFound desc = could not find container \"771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4\": container with ID starting with 771ce5120d6ca6775253a55e3f28ae12556da44b973d47e5e26e216dd46c01f4 not found: ID does not exist" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.519407 4831 scope.go:117] "RemoveContainer" containerID="e429cd7acf7a841733146b57baf4d335bd58700bbff2acd50cd7c5e20d635e6c" Dec 04 10:19:55 crc kubenswrapper[4831]: E1204 10:19:55.519769 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e429cd7acf7a841733146b57baf4d335bd58700bbff2acd50cd7c5e20d635e6c\": container with ID starting with e429cd7acf7a841733146b57baf4d335bd58700bbff2acd50cd7c5e20d635e6c not found: ID does not exist" containerID="e429cd7acf7a841733146b57baf4d335bd58700bbff2acd50cd7c5e20d635e6c" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.519855 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e429cd7acf7a841733146b57baf4d335bd58700bbff2acd50cd7c5e20d635e6c"} err="failed to get container status \"e429cd7acf7a841733146b57baf4d335bd58700bbff2acd50cd7c5e20d635e6c\": rpc error: code = NotFound desc = could not find container \"e429cd7acf7a841733146b57baf4d335bd58700bbff2acd50cd7c5e20d635e6c\": container with ID 
starting with e429cd7acf7a841733146b57baf4d335bd58700bbff2acd50cd7c5e20d635e6c not found: ID does not exist" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.519928 4831 scope.go:117] "RemoveContainer" containerID="1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2" Dec 04 10:19:55 crc kubenswrapper[4831]: E1204 10:19:55.520298 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2\": container with ID starting with 1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2 not found: ID does not exist" containerID="1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.520392 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2"} err="failed to get container status \"1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2\": rpc error: code = NotFound desc = could not find container \"1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2\": container with ID starting with 1558e7adba5c0f215d8ec18f9990aa8d5fa8a37a15b71744737e363e88e0d2a2 not found: ID does not exist" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.520482 4831 scope.go:117] "RemoveContainer" containerID="fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.524937 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069ef39a-dfc9-4a4c-acf4-758475a8a7b0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.532049 4831 scope.go:117] "RemoveContainer" containerID="2a385cb47e0c776be64864678c3446b6ec49aa0ce5bd5986a1e6eb8253d4d2af" Dec 04 10:19:55 crc 
kubenswrapper[4831]: I1204 10:19:55.543822 4831 scope.go:117] "RemoveContainer" containerID="0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.553839 4831 scope.go:117] "RemoveContainer" containerID="fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4" Dec 04 10:19:55 crc kubenswrapper[4831]: E1204 10:19:55.554253 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4\": container with ID starting with fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4 not found: ID does not exist" containerID="fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.554288 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4"} err="failed to get container status \"fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4\": rpc error: code = NotFound desc = could not find container \"fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4\": container with ID starting with fd98962a5f834818a07e310d528c6e9ddbaa8d27ba5da5a19eb238fa134473d4 not found: ID does not exist" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.554313 4831 scope.go:117] "RemoveContainer" containerID="2a385cb47e0c776be64864678c3446b6ec49aa0ce5bd5986a1e6eb8253d4d2af" Dec 04 10:19:55 crc kubenswrapper[4831]: E1204 10:19:55.555466 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a385cb47e0c776be64864678c3446b6ec49aa0ce5bd5986a1e6eb8253d4d2af\": container with ID starting with 2a385cb47e0c776be64864678c3446b6ec49aa0ce5bd5986a1e6eb8253d4d2af not found: ID does not exist" 
containerID="2a385cb47e0c776be64864678c3446b6ec49aa0ce5bd5986a1e6eb8253d4d2af" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.555502 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a385cb47e0c776be64864678c3446b6ec49aa0ce5bd5986a1e6eb8253d4d2af"} err="failed to get container status \"2a385cb47e0c776be64864678c3446b6ec49aa0ce5bd5986a1e6eb8253d4d2af\": rpc error: code = NotFound desc = could not find container \"2a385cb47e0c776be64864678c3446b6ec49aa0ce5bd5986a1e6eb8253d4d2af\": container with ID starting with 2a385cb47e0c776be64864678c3446b6ec49aa0ce5bd5986a1e6eb8253d4d2af not found: ID does not exist" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.555520 4831 scope.go:117] "RemoveContainer" containerID="0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b" Dec 04 10:19:55 crc kubenswrapper[4831]: E1204 10:19:55.556034 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b\": container with ID starting with 0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b not found: ID does not exist" containerID="0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.556086 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b"} err="failed to get container status \"0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b\": rpc error: code = NotFound desc = could not find container \"0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b\": container with ID starting with 0058f6b7a61d2ab2d50cc0839901fe460e913d30fe8348c2345bba8a1af2ea7b not found: ID does not exist" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.556120 4831 scope.go:117] 
"RemoveContainer" containerID="d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.567770 4831 scope.go:117] "RemoveContainer" containerID="cac80f8593894e36e4f9857cacc4e0368924db553aceea716589908a51699e70" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.578013 4831 scope.go:117] "RemoveContainer" containerID="6b6c94559811f8faf79684dcc5dfcd29da386088365131a2a6ffa4e9e9cc5ee1" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.589339 4831 scope.go:117] "RemoveContainer" containerID="d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204" Dec 04 10:19:55 crc kubenswrapper[4831]: E1204 10:19:55.589890 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204\": container with ID starting with d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204 not found: ID does not exist" containerID="d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.589920 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204"} err="failed to get container status \"d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204\": rpc error: code = NotFound desc = could not find container \"d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204\": container with ID starting with d9581d7f0482b6774ac9be45016a2394459c4c00b1484d3ee835cfc5a8446204 not found: ID does not exist" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.589940 4831 scope.go:117] "RemoveContainer" containerID="cac80f8593894e36e4f9857cacc4e0368924db553aceea716589908a51699e70" Dec 04 10:19:55 crc kubenswrapper[4831]: E1204 10:19:55.590198 4831 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"cac80f8593894e36e4f9857cacc4e0368924db553aceea716589908a51699e70\": container with ID starting with cac80f8593894e36e4f9857cacc4e0368924db553aceea716589908a51699e70 not found: ID does not exist" containerID="cac80f8593894e36e4f9857cacc4e0368924db553aceea716589908a51699e70" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.590223 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac80f8593894e36e4f9857cacc4e0368924db553aceea716589908a51699e70"} err="failed to get container status \"cac80f8593894e36e4f9857cacc4e0368924db553aceea716589908a51699e70\": rpc error: code = NotFound desc = could not find container \"cac80f8593894e36e4f9857cacc4e0368924db553aceea716589908a51699e70\": container with ID starting with cac80f8593894e36e4f9857cacc4e0368924db553aceea716589908a51699e70 not found: ID does not exist" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.590238 4831 scope.go:117] "RemoveContainer" containerID="6b6c94559811f8faf79684dcc5dfcd29da386088365131a2a6ffa4e9e9cc5ee1" Dec 04 10:19:55 crc kubenswrapper[4831]: E1204 10:19:55.592080 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b6c94559811f8faf79684dcc5dfcd29da386088365131a2a6ffa4e9e9cc5ee1\": container with ID starting with 6b6c94559811f8faf79684dcc5dfcd29da386088365131a2a6ffa4e9e9cc5ee1 not found: ID does not exist" containerID="6b6c94559811f8faf79684dcc5dfcd29da386088365131a2a6ffa4e9e9cc5ee1" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.592113 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b6c94559811f8faf79684dcc5dfcd29da386088365131a2a6ffa4e9e9cc5ee1"} err="failed to get container status \"6b6c94559811f8faf79684dcc5dfcd29da386088365131a2a6ffa4e9e9cc5ee1\": rpc error: code = NotFound desc = could not find container 
\"6b6c94559811f8faf79684dcc5dfcd29da386088365131a2a6ffa4e9e9cc5ee1\": container with ID starting with 6b6c94559811f8faf79684dcc5dfcd29da386088365131a2a6ffa4e9e9cc5ee1 not found: ID does not exist" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.592142 4831 scope.go:117] "RemoveContainer" containerID="398a5435e0416adc6d4938d7ad0d91b9ca52042c42d9094b63038b7167258736" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.605854 4831 scope.go:117] "RemoveContainer" containerID="f65497c5935b108d68c9c9bdfab5b9984be8d376d71d1770615e21ddff93b5e1" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.618579 4831 scope.go:117] "RemoveContainer" containerID="63f91e592ea006ce92a36cf5e1f99851ccdbd4e1979bf367ba56187011e52ec5" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.631424 4831 scope.go:117] "RemoveContainer" containerID="9ce0fce46321ec484701f16167b757cb2270b8a05acbc0a2c516d3e40a7374f7" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.646704 4831 scope.go:117] "RemoveContainer" containerID="26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.661273 4831 scope.go:117] "RemoveContainer" containerID="26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb" Dec 04 10:19:55 crc kubenswrapper[4831]: E1204 10:19:55.661730 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb\": container with ID starting with 26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb not found: ID does not exist" containerID="26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.661785 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb"} err="failed to get container 
status \"26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb\": rpc error: code = NotFound desc = could not find container \"26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb\": container with ID starting with 26e109597f23457a82547a3ff0779067d116483ccf3d0aeb35eb8b1215b510eb not found: ID does not exist" Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.771080 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bbqht"] Dec 04 10:19:55 crc kubenswrapper[4831]: I1204 10:19:55.780003 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bbqht"] Dec 04 10:19:57 crc kubenswrapper[4831]: I1204 10:19:57.283841 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0164870d-aba7-43b6-b798-93ec48968837" path="/var/lib/kubelet/pods/0164870d-aba7-43b6-b798-93ec48968837/volumes" Dec 04 10:19:57 crc kubenswrapper[4831]: I1204 10:19:57.284468 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" path="/var/lib/kubelet/pods/069ef39a-dfc9-4a4c-acf4-758475a8a7b0/volumes" Dec 04 10:19:57 crc kubenswrapper[4831]: I1204 10:19:57.285026 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d53c94-9ca3-4b5a-b33d-829f44de5367" path="/var/lib/kubelet/pods/16d53c94-9ca3-4b5a-b33d-829f44de5367/volumes" Dec 04 10:19:57 crc kubenswrapper[4831]: I1204 10:19:57.285611 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f9aaec-7f63-4909-9ffe-6b073e0225d9" path="/var/lib/kubelet/pods/73f9aaec-7f63-4909-9ffe-6b073e0225d9/volumes" Dec 04 10:19:57 crc kubenswrapper[4831]: I1204 10:19:57.286055 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" path="/var/lib/kubelet/pods/cdf24e44-d00c-424c-aa3c-3e7fdb1a2997/volumes" Dec 04 10:20:06 crc kubenswrapper[4831]: I1204 10:20:06.299270 4831 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 04 10:20:07 crc kubenswrapper[4831]: I1204 10:20:07.417971 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 10:20:07 crc kubenswrapper[4831]: I1204 10:20:07.466954 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 10:20:14 crc kubenswrapper[4831]: I1204 10:20:14.147746 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 10:20:16 crc kubenswrapper[4831]: I1204 10:20:16.139063 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 04 10:20:31 crc kubenswrapper[4831]: I1204 10:20:31.667287 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qssm4"] Dec 04 10:20:31 crc kubenswrapper[4831]: I1204 10:20:31.668215 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" podUID="2b483323-5ed6-40b5-b256-c9de7033e4eb" containerName="controller-manager" containerID="cri-o://66e55b3027ad31a707d9bf24e1630fb49945d54e22f6f0f95452c76e4067ee58" gracePeriod=30 Dec 04 10:20:31 crc kubenswrapper[4831]: I1204 10:20:31.773902 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb"] Dec 04 10:20:31 crc kubenswrapper[4831]: I1204 10:20:31.774467 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" podUID="edd157b6-46a0-4a10-94fb-670544f743ca" containerName="route-controller-manager" 
containerID="cri-o://1dd9407ee7daf190ffbc1948c448cbf5edd5f023b937130a5bdcff4a053f3013" gracePeriod=30 Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.012519 4831 generic.go:334] "Generic (PLEG): container finished" podID="2b483323-5ed6-40b5-b256-c9de7033e4eb" containerID="66e55b3027ad31a707d9bf24e1630fb49945d54e22f6f0f95452c76e4067ee58" exitCode=0 Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.012572 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" event={"ID":"2b483323-5ed6-40b5-b256-c9de7033e4eb","Type":"ContainerDied","Data":"66e55b3027ad31a707d9bf24e1630fb49945d54e22f6f0f95452c76e4067ee58"} Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.016173 4831 generic.go:334] "Generic (PLEG): container finished" podID="edd157b6-46a0-4a10-94fb-670544f743ca" containerID="1dd9407ee7daf190ffbc1948c448cbf5edd5f023b937130a5bdcff4a053f3013" exitCode=0 Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.016212 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" event={"ID":"edd157b6-46a0-4a10-94fb-670544f743ca","Type":"ContainerDied","Data":"1dd9407ee7daf190ffbc1948c448cbf5edd5f023b937130a5bdcff4a053f3013"} Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.053434 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.088714 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.223595 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-config\") pod \"edd157b6-46a0-4a10-94fb-670544f743ca\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.223724 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kttjg\" (UniqueName: \"kubernetes.io/projected/edd157b6-46a0-4a10-94fb-670544f743ca-kube-api-access-kttjg\") pod \"edd157b6-46a0-4a10-94fb-670544f743ca\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.224913 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b483323-5ed6-40b5-b256-c9de7033e4eb-serving-cert\") pod \"2b483323-5ed6-40b5-b256-c9de7033e4eb\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.225004 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edd157b6-46a0-4a10-94fb-670544f743ca-serving-cert\") pod \"edd157b6-46a0-4a10-94fb-670544f743ca\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.225044 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-client-ca\") pod \"edd157b6-46a0-4a10-94fb-670544f743ca\" (UID: \"edd157b6-46a0-4a10-94fb-670544f743ca\") " Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.225115 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-config\") pod \"2b483323-5ed6-40b5-b256-c9de7033e4eb\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.225141 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-client-ca\") pod \"2b483323-5ed6-40b5-b256-c9de7033e4eb\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.225133 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-config" (OuterVolumeSpecName: "config") pod "edd157b6-46a0-4a10-94fb-670544f743ca" (UID: "edd157b6-46a0-4a10-94fb-670544f743ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.225168 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcdwq\" (UniqueName: \"kubernetes.io/projected/2b483323-5ed6-40b5-b256-c9de7033e4eb-kube-api-access-hcdwq\") pod \"2b483323-5ed6-40b5-b256-c9de7033e4eb\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.225201 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-proxy-ca-bundles\") pod \"2b483323-5ed6-40b5-b256-c9de7033e4eb\" (UID: \"2b483323-5ed6-40b5-b256-c9de7033e4eb\") " Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.225398 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.225780 4831 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-client-ca" (OuterVolumeSpecName: "client-ca") pod "2b483323-5ed6-40b5-b256-c9de7033e4eb" (UID: "2b483323-5ed6-40b5-b256-c9de7033e4eb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.226113 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2b483323-5ed6-40b5-b256-c9de7033e4eb" (UID: "2b483323-5ed6-40b5-b256-c9de7033e4eb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.226156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-client-ca" (OuterVolumeSpecName: "client-ca") pod "edd157b6-46a0-4a10-94fb-670544f743ca" (UID: "edd157b6-46a0-4a10-94fb-670544f743ca"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.226216 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-config" (OuterVolumeSpecName: "config") pod "2b483323-5ed6-40b5-b256-c9de7033e4eb" (UID: "2b483323-5ed6-40b5-b256-c9de7033e4eb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.231109 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd157b6-46a0-4a10-94fb-670544f743ca-kube-api-access-kttjg" (OuterVolumeSpecName: "kube-api-access-kttjg") pod "edd157b6-46a0-4a10-94fb-670544f743ca" (UID: "edd157b6-46a0-4a10-94fb-670544f743ca"). InnerVolumeSpecName "kube-api-access-kttjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.231114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b483323-5ed6-40b5-b256-c9de7033e4eb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2b483323-5ed6-40b5-b256-c9de7033e4eb" (UID: "2b483323-5ed6-40b5-b256-c9de7033e4eb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.231153 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd157b6-46a0-4a10-94fb-670544f743ca-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "edd157b6-46a0-4a10-94fb-670544f743ca" (UID: "edd157b6-46a0-4a10-94fb-670544f743ca"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.231863 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b483323-5ed6-40b5-b256-c9de7033e4eb-kube-api-access-hcdwq" (OuterVolumeSpecName: "kube-api-access-hcdwq") pod "2b483323-5ed6-40b5-b256-c9de7033e4eb" (UID: "2b483323-5ed6-40b5-b256-c9de7033e4eb"). InnerVolumeSpecName "kube-api-access-hcdwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.326398 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.326450 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.326473 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcdwq\" (UniqueName: \"kubernetes.io/projected/2b483323-5ed6-40b5-b256-c9de7033e4eb-kube-api-access-hcdwq\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.326492 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b483323-5ed6-40b5-b256-c9de7033e4eb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.326508 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kttjg\" (UniqueName: \"kubernetes.io/projected/edd157b6-46a0-4a10-94fb-670544f743ca-kube-api-access-kttjg\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.326525 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b483323-5ed6-40b5-b256-c9de7033e4eb-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.326541 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edd157b6-46a0-4a10-94fb-670544f743ca-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:32 crc kubenswrapper[4831]: I1204 10:20:32.326556 4831 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edd157b6-46a0-4a10-94fb-670544f743ca-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.027459 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" event={"ID":"2b483323-5ed6-40b5-b256-c9de7033e4eb","Type":"ContainerDied","Data":"a4ab4eb18963e5eb18ffad30f6ac3e43e26a1ea255f8496c712d742a8bc03e3f"} Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.027516 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qssm4" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.027527 4831 scope.go:117] "RemoveContainer" containerID="66e55b3027ad31a707d9bf24e1630fb49945d54e22f6f0f95452c76e4067ee58" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.030548 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" event={"ID":"edd157b6-46a0-4a10-94fb-670544f743ca","Type":"ContainerDied","Data":"b73534be47bcd81604328d43ec7c3924fb6734d07024444576849fe63cbeb56e"} Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.030643 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.057501 4831 scope.go:117] "RemoveContainer" containerID="1dd9407ee7daf190ffbc1948c448cbf5edd5f023b937130a5bdcff4a053f3013" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.079511 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qssm4"] Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.083649 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qssm4"] Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.090188 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb"] Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.093301 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rsttb"] Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.192226 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp"] Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.192636 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.192700 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.192731 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d53c94-9ca3-4b5a-b33d-829f44de5367" containerName="extract-content" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.192749 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="16d53c94-9ca3-4b5a-b33d-829f44de5367" containerName="extract-content" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.192775 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.192791 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.192821 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0164870d-aba7-43b6-b798-93ec48968837" containerName="extract-utilities" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.192837 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0164870d-aba7-43b6-b798-93ec48968837" containerName="extract-utilities" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.192857 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b483323-5ed6-40b5-b256-c9de7033e4eb" containerName="controller-manager" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.192873 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b483323-5ed6-40b5-b256-c9de7033e4eb" containerName="controller-manager" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.192892 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" containerName="extract-utilities" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.192908 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" containerName="extract-utilities" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.192933 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0164870d-aba7-43b6-b798-93ec48968837" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.192949 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0164870d-aba7-43b6-b798-93ec48968837" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.192970 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" containerName="extract-utilities" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.192994 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" containerName="extract-utilities" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.193012 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" containerName="extract-content" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193028 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" containerName="extract-content" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.193054 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d53c94-9ca3-4b5a-b33d-829f44de5367" containerName="extract-utilities" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193072 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d53c94-9ca3-4b5a-b33d-829f44de5367" containerName="extract-utilities" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.193090 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d53c94-9ca3-4b5a-b33d-829f44de5367" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193107 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d53c94-9ca3-4b5a-b33d-829f44de5367" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.193133 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" containerName="extract-content" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193148 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" containerName="extract-content" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.193172 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f9aaec-7f63-4909-9ffe-6b073e0225d9" containerName="marketplace-operator" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193215 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f9aaec-7f63-4909-9ffe-6b073e0225d9" containerName="marketplace-operator" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.193242 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0164870d-aba7-43b6-b798-93ec48968837" containerName="extract-content" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193258 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0164870d-aba7-43b6-b798-93ec48968837" containerName="extract-content" Dec 04 10:20:33 crc kubenswrapper[4831]: E1204 10:20:33.193282 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd157b6-46a0-4a10-94fb-670544f743ca" containerName="route-controller-manager" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193299 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd157b6-46a0-4a10-94fb-670544f743ca" containerName="route-controller-manager" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193522 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b483323-5ed6-40b5-b256-c9de7033e4eb" containerName="controller-manager" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193560 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd157b6-46a0-4a10-94fb-670544f743ca" containerName="route-controller-manager" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193585 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d53c94-9ca3-4b5a-b33d-829f44de5367" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193613 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf24e44-d00c-424c-aa3c-3e7fdb1a2997" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193637 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="069ef39a-dfc9-4a4c-acf4-758475a8a7b0" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193695 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0164870d-aba7-43b6-b798-93ec48968837" containerName="registry-server" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.193720 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f9aaec-7f63-4909-9ffe-6b073e0225d9" containerName="marketplace-operator" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.194348 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht"] Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.194550 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.196098 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.198279 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.198778 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.199052 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.199528 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.199835 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.200175 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.200347 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.200884 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.202274 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.202277 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 10:20:33 crc 
kubenswrapper[4831]: I1204 10:20:33.202560 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.202680 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.213029 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht"] Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.264629 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc4bk\" (UniqueName: \"kubernetes.io/projected/e969b640-c96e-4d25-82a4-d16e05b4a242-kube-api-access-kc4bk\") pod \"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.264694 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e969b640-c96e-4d25-82a4-d16e05b4a242-serving-cert\") pod \"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.264733 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-config\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.264762 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-client-ca\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.264811 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-serving-cert\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.264835 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-client-ca\") pod \"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.264857 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-proxy-ca-bundles\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.264885 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-config\") pod 
\"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.264913 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk9nw\" (UniqueName: \"kubernetes.io/projected/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-kube-api-access-mk9nw\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.270802 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp"] Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.273525 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.287451 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b483323-5ed6-40b5-b256-c9de7033e4eb" path="/var/lib/kubelet/pods/2b483323-5ed6-40b5-b256-c9de7033e4eb/volumes" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.288367 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd157b6-46a0-4a10-94fb-670544f743ca" path="/var/lib/kubelet/pods/edd157b6-46a0-4a10-94fb-670544f743ca/volumes" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.367817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-config\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.367884 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-client-ca\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.368003 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-serving-cert\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.368039 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-client-ca\") pod \"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.368940 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-proxy-ca-bundles\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.369029 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-config\") pod \"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " 
pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.369431 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk9nw\" (UniqueName: \"kubernetes.io/projected/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-kube-api-access-mk9nw\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.369499 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc4bk\" (UniqueName: \"kubernetes.io/projected/e969b640-c96e-4d25-82a4-d16e05b4a242-kube-api-access-kc4bk\") pod \"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.369523 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e969b640-c96e-4d25-82a4-d16e05b4a242-serving-cert\") pod \"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.370004 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-proxy-ca-bundles\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.370156 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-client-ca\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.370402 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-config\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.370856 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-config\") pod \"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.371477 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-client-ca\") pod \"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.374422 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e969b640-c96e-4d25-82a4-d16e05b4a242-serving-cert\") pod \"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.376787 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-serving-cert\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.386054 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk9nw\" (UniqueName: \"kubernetes.io/projected/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-kube-api-access-mk9nw\") pod \"controller-manager-5f4c649f4c-7ndrp\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.389555 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc4bk\" (UniqueName: \"kubernetes.io/projected/e969b640-c96e-4d25-82a4-d16e05b4a242-kube-api-access-kc4bk\") pod \"route-controller-manager-57d4b786c5-wfxht\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.573322 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.586877 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.847291 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp"] Dec 04 10:20:33 crc kubenswrapper[4831]: I1204 10:20:33.898993 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht"] Dec 04 10:20:33 crc kubenswrapper[4831]: W1204 10:20:33.919378 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode969b640_c96e_4d25_82a4_d16e05b4a242.slice/crio-b142aa60cdd8eed7470a690b0933ce8d0b3dd0a772aea65b74e9af023b0fbd66 WatchSource:0}: Error finding container b142aa60cdd8eed7470a690b0933ce8d0b3dd0a772aea65b74e9af023b0fbd66: Status 404 returned error can't find the container with id b142aa60cdd8eed7470a690b0933ce8d0b3dd0a772aea65b74e9af023b0fbd66 Dec 04 10:20:34 crc kubenswrapper[4831]: I1204 10:20:34.038623 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" event={"ID":"e969b640-c96e-4d25-82a4-d16e05b4a242","Type":"ContainerStarted","Data":"b142aa60cdd8eed7470a690b0933ce8d0b3dd0a772aea65b74e9af023b0fbd66"} Dec 04 10:20:34 crc kubenswrapper[4831]: I1204 10:20:34.043025 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" event={"ID":"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46","Type":"ContainerStarted","Data":"29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467"} Dec 04 10:20:34 crc kubenswrapper[4831]: I1204 10:20:34.043074 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" 
event={"ID":"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46","Type":"ContainerStarted","Data":"da2d3675ec6a76653f34c20e185cac885ec4610dc3a9fc1a9cc3501d0287edb7"} Dec 04 10:20:34 crc kubenswrapper[4831]: I1204 10:20:34.043351 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:34 crc kubenswrapper[4831]: I1204 10:20:34.045466 4831 patch_prober.go:28] interesting pod/controller-manager-5f4c649f4c-7ndrp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Dec 04 10:20:34 crc kubenswrapper[4831]: I1204 10:20:34.045542 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" podUID="2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Dec 04 10:20:34 crc kubenswrapper[4831]: I1204 10:20:34.069909 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" podStartSLOduration=3.069879867 podStartE2EDuration="3.069879867s" podCreationTimestamp="2025-12-04 10:20:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:20:34.062424698 +0000 UTC m=+331.011600032" watchObservedRunningTime="2025-12-04 10:20:34.069879867 +0000 UTC m=+331.019055211" Dec 04 10:20:35 crc kubenswrapper[4831]: I1204 10:20:35.057573 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" 
event={"ID":"e969b640-c96e-4d25-82a4-d16e05b4a242","Type":"ContainerStarted","Data":"30066b72712944614a2b824e33f38e9c06bbaadd3eea9aee23f85c2fe59c6065"} Dec 04 10:20:35 crc kubenswrapper[4831]: I1204 10:20:35.061466 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:20:35 crc kubenswrapper[4831]: I1204 10:20:35.072480 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" podStartSLOduration=4.072452605 podStartE2EDuration="4.072452605s" podCreationTimestamp="2025-12-04 10:20:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:20:35.072374935 +0000 UTC m=+332.021550249" watchObservedRunningTime="2025-12-04 10:20:35.072452605 +0000 UTC m=+332.021627919" Dec 04 10:20:36 crc kubenswrapper[4831]: I1204 10:20:36.062860 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:36 crc kubenswrapper[4831]: I1204 10:20:36.070058 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:51 crc kubenswrapper[4831]: I1204 10:20:51.629931 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht"] Dec 04 10:20:51 crc kubenswrapper[4831]: I1204 10:20:51.631793 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" podUID="e969b640-c96e-4d25-82a4-d16e05b4a242" containerName="route-controller-manager" containerID="cri-o://30066b72712944614a2b824e33f38e9c06bbaadd3eea9aee23f85c2fe59c6065" 
gracePeriod=30 Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.176256 4831 generic.go:334] "Generic (PLEG): container finished" podID="e969b640-c96e-4d25-82a4-d16e05b4a242" containerID="30066b72712944614a2b824e33f38e9c06bbaadd3eea9aee23f85c2fe59c6065" exitCode=0 Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.176335 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" event={"ID":"e969b640-c96e-4d25-82a4-d16e05b4a242","Type":"ContainerDied","Data":"30066b72712944614a2b824e33f38e9c06bbaadd3eea9aee23f85c2fe59c6065"} Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.645660 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.696364 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9"] Dec 04 10:20:52 crc kubenswrapper[4831]: E1204 10:20:52.696686 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e969b640-c96e-4d25-82a4-d16e05b4a242" containerName="route-controller-manager" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.696702 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e969b640-c96e-4d25-82a4-d16e05b4a242" containerName="route-controller-manager" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.696834 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e969b640-c96e-4d25-82a4-d16e05b4a242" containerName="route-controller-manager" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.697317 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.701344 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9"] Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.765001 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-config\") pod \"e969b640-c96e-4d25-82a4-d16e05b4a242\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.765091 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc4bk\" (UniqueName: \"kubernetes.io/projected/e969b640-c96e-4d25-82a4-d16e05b4a242-kube-api-access-kc4bk\") pod \"e969b640-c96e-4d25-82a4-d16e05b4a242\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.765376 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e969b640-c96e-4d25-82a4-d16e05b4a242-serving-cert\") pod \"e969b640-c96e-4d25-82a4-d16e05b4a242\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.765419 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-client-ca\") pod \"e969b640-c96e-4d25-82a4-d16e05b4a242\" (UID: \"e969b640-c96e-4d25-82a4-d16e05b4a242\") " Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.765517 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/551d34f8-f233-42af-b4a2-5b93f26daf67-serving-cert\") pod 
\"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.765545 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/551d34f8-f233-42af-b4a2-5b93f26daf67-config\") pod \"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.765580 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpvzv\" (UniqueName: \"kubernetes.io/projected/551d34f8-f233-42af-b4a2-5b93f26daf67-kube-api-access-fpvzv\") pod \"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.765605 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/551d34f8-f233-42af-b4a2-5b93f26daf67-client-ca\") pod \"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.766158 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-config" (OuterVolumeSpecName: "config") pod "e969b640-c96e-4d25-82a4-d16e05b4a242" (UID: "e969b640-c96e-4d25-82a4-d16e05b4a242"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.766346 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-client-ca" (OuterVolumeSpecName: "client-ca") pod "e969b640-c96e-4d25-82a4-d16e05b4a242" (UID: "e969b640-c96e-4d25-82a4-d16e05b4a242"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.771287 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e969b640-c96e-4d25-82a4-d16e05b4a242-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e969b640-c96e-4d25-82a4-d16e05b4a242" (UID: "e969b640-c96e-4d25-82a4-d16e05b4a242"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.772280 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e969b640-c96e-4d25-82a4-d16e05b4a242-kube-api-access-kc4bk" (OuterVolumeSpecName: "kube-api-access-kc4bk") pod "e969b640-c96e-4d25-82a4-d16e05b4a242" (UID: "e969b640-c96e-4d25-82a4-d16e05b4a242"). InnerVolumeSpecName "kube-api-access-kc4bk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.866741 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/551d34f8-f233-42af-b4a2-5b93f26daf67-serving-cert\") pod \"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.866788 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/551d34f8-f233-42af-b4a2-5b93f26daf67-config\") pod \"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.866834 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpvzv\" (UniqueName: \"kubernetes.io/projected/551d34f8-f233-42af-b4a2-5b93f26daf67-kube-api-access-fpvzv\") pod \"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.866860 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/551d34f8-f233-42af-b4a2-5b93f26daf67-client-ca\") pod \"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.866947 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc4bk\" (UniqueName: 
\"kubernetes.io/projected/e969b640-c96e-4d25-82a4-d16e05b4a242-kube-api-access-kc4bk\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.866958 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e969b640-c96e-4d25-82a4-d16e05b4a242-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.866981 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.866992 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e969b640-c96e-4d25-82a4-d16e05b4a242-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.867906 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/551d34f8-f233-42af-b4a2-5b93f26daf67-client-ca\") pod \"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.868179 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/551d34f8-f233-42af-b4a2-5b93f26daf67-config\") pod \"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.873041 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/551d34f8-f233-42af-b4a2-5b93f26daf67-serving-cert\") pod 
\"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:52 crc kubenswrapper[4831]: I1204 10:20:52.895225 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpvzv\" (UniqueName: \"kubernetes.io/projected/551d34f8-f233-42af-b4a2-5b93f26daf67-kube-api-access-fpvzv\") pod \"route-controller-manager-797948b588-vhwb9\" (UID: \"551d34f8-f233-42af-b4a2-5b93f26daf67\") " pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:53 crc kubenswrapper[4831]: I1204 10:20:53.017444 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:53 crc kubenswrapper[4831]: I1204 10:20:53.182652 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" event={"ID":"e969b640-c96e-4d25-82a4-d16e05b4a242","Type":"ContainerDied","Data":"b142aa60cdd8eed7470a690b0933ce8d0b3dd0a772aea65b74e9af023b0fbd66"} Dec 04 10:20:53 crc kubenswrapper[4831]: I1204 10:20:53.182721 4831 scope.go:117] "RemoveContainer" containerID="30066b72712944614a2b824e33f38e9c06bbaadd3eea9aee23f85c2fe59c6065" Dec 04 10:20:53 crc kubenswrapper[4831]: I1204 10:20:53.182835 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht" Dec 04 10:20:53 crc kubenswrapper[4831]: I1204 10:20:53.250135 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht"] Dec 04 10:20:53 crc kubenswrapper[4831]: I1204 10:20:53.253910 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d4b786c5-wfxht"] Dec 04 10:20:53 crc kubenswrapper[4831]: I1204 10:20:53.289401 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e969b640-c96e-4d25-82a4-d16e05b4a242" path="/var/lib/kubelet/pods/e969b640-c96e-4d25-82a4-d16e05b4a242/volumes" Dec 04 10:20:53 crc kubenswrapper[4831]: I1204 10:20:53.503908 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9"] Dec 04 10:20:54 crc kubenswrapper[4831]: I1204 10:20:54.189358 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" event={"ID":"551d34f8-f233-42af-b4a2-5b93f26daf67","Type":"ContainerStarted","Data":"c9ac186aade49e359ad1360483374b83aa01b59418d3e627437f31a3564befad"} Dec 04 10:20:54 crc kubenswrapper[4831]: I1204 10:20:54.189394 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" event={"ID":"551d34f8-f233-42af-b4a2-5b93f26daf67","Type":"ContainerStarted","Data":"8617b0a3a0da86b138f9dd979aa8649cdc04541455171e77ad22a6ba66409c02"} Dec 04 10:20:54 crc kubenswrapper[4831]: I1204 10:20:54.191585 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:20:54 crc kubenswrapper[4831]: I1204 10:20:54.207074 4831 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" podStartSLOduration=3.207058143 podStartE2EDuration="3.207058143s" podCreationTimestamp="2025-12-04 10:20:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:20:54.204269298 +0000 UTC m=+351.153444632" watchObservedRunningTime="2025-12-04 10:20:54.207058143 +0000 UTC m=+351.156233457" Dec 04 10:20:54 crc kubenswrapper[4831]: I1204 10:20:54.263684 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-797948b588-vhwb9" Dec 04 10:21:07 crc kubenswrapper[4831]: I1204 10:21:07.913442 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-khfsg"] Dec 04 10:21:07 crc kubenswrapper[4831]: I1204 10:21:07.916250 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:07 crc kubenswrapper[4831]: I1204 10:21:07.918845 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 10:21:07 crc kubenswrapper[4831]: I1204 10:21:07.927357 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khfsg"] Dec 04 10:21:07 crc kubenswrapper[4831]: I1204 10:21:07.956735 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d852025-4ea6-4343-813c-2411dec5469f-catalog-content\") pod \"redhat-operators-khfsg\" (UID: \"1d852025-4ea6-4343-813c-2411dec5469f\") " pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:07 crc kubenswrapper[4831]: I1204 10:21:07.956829 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-g5l2b\" (UniqueName: \"kubernetes.io/projected/1d852025-4ea6-4343-813c-2411dec5469f-kube-api-access-g5l2b\") pod \"redhat-operators-khfsg\" (UID: \"1d852025-4ea6-4343-813c-2411dec5469f\") " pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:07 crc kubenswrapper[4831]: I1204 10:21:07.956864 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d852025-4ea6-4343-813c-2411dec5469f-utilities\") pod \"redhat-operators-khfsg\" (UID: \"1d852025-4ea6-4343-813c-2411dec5469f\") " pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.058189 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d852025-4ea6-4343-813c-2411dec5469f-catalog-content\") pod \"redhat-operators-khfsg\" (UID: \"1d852025-4ea6-4343-813c-2411dec5469f\") " pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.058576 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5l2b\" (UniqueName: \"kubernetes.io/projected/1d852025-4ea6-4343-813c-2411dec5469f-kube-api-access-g5l2b\") pod \"redhat-operators-khfsg\" (UID: \"1d852025-4ea6-4343-813c-2411dec5469f\") " pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.058706 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d852025-4ea6-4343-813c-2411dec5469f-utilities\") pod \"redhat-operators-khfsg\" (UID: \"1d852025-4ea6-4343-813c-2411dec5469f\") " pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.058771 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/1d852025-4ea6-4343-813c-2411dec5469f-catalog-content\") pod \"redhat-operators-khfsg\" (UID: \"1d852025-4ea6-4343-813c-2411dec5469f\") " pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.059113 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d852025-4ea6-4343-813c-2411dec5469f-utilities\") pod \"redhat-operators-khfsg\" (UID: \"1d852025-4ea6-4343-813c-2411dec5469f\") " pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.078729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5l2b\" (UniqueName: \"kubernetes.io/projected/1d852025-4ea6-4343-813c-2411dec5469f-kube-api-access-g5l2b\") pod \"redhat-operators-khfsg\" (UID: \"1d852025-4ea6-4343-813c-2411dec5469f\") " pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.106036 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-95vcr"] Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.107192 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.109134 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.113119 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-95vcr"] Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.235201 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.262916 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/910ac3eb-beda-4174-b3da-e3d708ffbcc3-catalog-content\") pod \"certified-operators-95vcr\" (UID: \"910ac3eb-beda-4174-b3da-e3d708ffbcc3\") " pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.263314 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/910ac3eb-beda-4174-b3da-e3d708ffbcc3-utilities\") pod \"certified-operators-95vcr\" (UID: \"910ac3eb-beda-4174-b3da-e3d708ffbcc3\") " pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.263387 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rhdg\" (UniqueName: \"kubernetes.io/projected/910ac3eb-beda-4174-b3da-e3d708ffbcc3-kube-api-access-4rhdg\") pod \"certified-operators-95vcr\" (UID: \"910ac3eb-beda-4174-b3da-e3d708ffbcc3\") " pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.364606 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/910ac3eb-beda-4174-b3da-e3d708ffbcc3-utilities\") pod \"certified-operators-95vcr\" (UID: \"910ac3eb-beda-4174-b3da-e3d708ffbcc3\") " pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.364724 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rhdg\" (UniqueName: \"kubernetes.io/projected/910ac3eb-beda-4174-b3da-e3d708ffbcc3-kube-api-access-4rhdg\") pod 
\"certified-operators-95vcr\" (UID: \"910ac3eb-beda-4174-b3da-e3d708ffbcc3\") " pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.364775 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/910ac3eb-beda-4174-b3da-e3d708ffbcc3-catalog-content\") pod \"certified-operators-95vcr\" (UID: \"910ac3eb-beda-4174-b3da-e3d708ffbcc3\") " pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.365277 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/910ac3eb-beda-4174-b3da-e3d708ffbcc3-catalog-content\") pod \"certified-operators-95vcr\" (UID: \"910ac3eb-beda-4174-b3da-e3d708ffbcc3\") " pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.365549 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/910ac3eb-beda-4174-b3da-e3d708ffbcc3-utilities\") pod \"certified-operators-95vcr\" (UID: \"910ac3eb-beda-4174-b3da-e3d708ffbcc3\") " pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.402174 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rhdg\" (UniqueName: \"kubernetes.io/projected/910ac3eb-beda-4174-b3da-e3d708ffbcc3-kube-api-access-4rhdg\") pod \"certified-operators-95vcr\" (UID: \"910ac3eb-beda-4174-b3da-e3d708ffbcc3\") " pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.464505 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.657751 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khfsg"] Dec 04 10:21:08 crc kubenswrapper[4831]: W1204 10:21:08.664098 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d852025_4ea6_4343_813c_2411dec5469f.slice/crio-accce8d746c317c477956a1416619daeca2bc6c656c9915b6ebe024bd70ea0f6 WatchSource:0}: Error finding container accce8d746c317c477956a1416619daeca2bc6c656c9915b6ebe024bd70ea0f6: Status 404 returned error can't find the container with id accce8d746c317c477956a1416619daeca2bc6c656c9915b6ebe024bd70ea0f6 Dec 04 10:21:08 crc kubenswrapper[4831]: I1204 10:21:08.874206 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-95vcr"] Dec 04 10:21:08 crc kubenswrapper[4831]: W1204 10:21:08.914393 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod910ac3eb_beda_4174_b3da_e3d708ffbcc3.slice/crio-8b66efe29932eb1bcf1763bab09a4a26e5e7c3a8dbe5f80b15823dc209435599 WatchSource:0}: Error finding container 8b66efe29932eb1bcf1763bab09a4a26e5e7c3a8dbe5f80b15823dc209435599: Status 404 returned error can't find the container with id 8b66efe29932eb1bcf1763bab09a4a26e5e7c3a8dbe5f80b15823dc209435599 Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.267549 4831 generic.go:334] "Generic (PLEG): container finished" podID="910ac3eb-beda-4174-b3da-e3d708ffbcc3" containerID="0bee78cabe00080d02b1bca773182c9944448fc3ce1f17cd459ef74f76f6d231" exitCode=0 Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.267649 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95vcr" 
event={"ID":"910ac3eb-beda-4174-b3da-e3d708ffbcc3","Type":"ContainerDied","Data":"0bee78cabe00080d02b1bca773182c9944448fc3ce1f17cd459ef74f76f6d231"} Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.268173 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95vcr" event={"ID":"910ac3eb-beda-4174-b3da-e3d708ffbcc3","Type":"ContainerStarted","Data":"8b66efe29932eb1bcf1763bab09a4a26e5e7c3a8dbe5f80b15823dc209435599"} Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.272291 4831 generic.go:334] "Generic (PLEG): container finished" podID="1d852025-4ea6-4343-813c-2411dec5469f" containerID="251a39f807f248bfd5666632d5ec8162729e52ab7619f6ff2ff098da14153c69" exitCode=0 Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.272351 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khfsg" event={"ID":"1d852025-4ea6-4343-813c-2411dec5469f","Type":"ContainerDied","Data":"251a39f807f248bfd5666632d5ec8162729e52ab7619f6ff2ff098da14153c69"} Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.272387 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khfsg" event={"ID":"1d852025-4ea6-4343-813c-2411dec5469f","Type":"ContainerStarted","Data":"accce8d746c317c477956a1416619daeca2bc6c656c9915b6ebe024bd70ea0f6"} Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.710434 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-99lqd"] Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.711572 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.720112 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.727009 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99lqd"] Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.882446 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1-catalog-content\") pod \"community-operators-99lqd\" (UID: \"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1\") " pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.882509 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwhdv\" (UniqueName: \"kubernetes.io/projected/6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1-kube-api-access-vwhdv\") pod \"community-operators-99lqd\" (UID: \"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1\") " pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.882561 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1-utilities\") pod \"community-operators-99lqd\" (UID: \"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1\") " pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.983372 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1-utilities\") pod \"community-operators-99lqd\" (UID: 
\"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1\") " pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.983453 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1-catalog-content\") pod \"community-operators-99lqd\" (UID: \"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1\") " pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.983499 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwhdv\" (UniqueName: \"kubernetes.io/projected/6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1-kube-api-access-vwhdv\") pod \"community-operators-99lqd\" (UID: \"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1\") " pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.983894 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1-utilities\") pod \"community-operators-99lqd\" (UID: \"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1\") " pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:09 crc kubenswrapper[4831]: I1204 10:21:09.985363 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1-catalog-content\") pod \"community-operators-99lqd\" (UID: \"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1\") " pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.003104 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwhdv\" (UniqueName: \"kubernetes.io/projected/6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1-kube-api-access-vwhdv\") pod \"community-operators-99lqd\" (UID: 
\"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1\") " pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.029080 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.279439 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khfsg" event={"ID":"1d852025-4ea6-4343-813c-2411dec5469f","Type":"ContainerStarted","Data":"d93b8fbea0178457775abc09c74f9b88958c916630351a7f9c004427630c6754"} Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.288908 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95vcr" event={"ID":"910ac3eb-beda-4174-b3da-e3d708ffbcc3","Type":"ContainerStarted","Data":"4081f195d518653a427bd15d2846c74ed76c6716d0aca6928625914895af06cd"} Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.414840 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99lqd"] Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.702622 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j9qm7"] Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.703868 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.706415 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.721128 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9qm7"] Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.800870 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr22b\" (UniqueName: \"kubernetes.io/projected/a598e150-ad70-412c-bf06-9e7bd26a8422-kube-api-access-cr22b\") pod \"redhat-marketplace-j9qm7\" (UID: \"a598e150-ad70-412c-bf06-9e7bd26a8422\") " pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.800920 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a598e150-ad70-412c-bf06-9e7bd26a8422-utilities\") pod \"redhat-marketplace-j9qm7\" (UID: \"a598e150-ad70-412c-bf06-9e7bd26a8422\") " pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.800978 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a598e150-ad70-412c-bf06-9e7bd26a8422-catalog-content\") pod \"redhat-marketplace-j9qm7\" (UID: \"a598e150-ad70-412c-bf06-9e7bd26a8422\") " pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.901856 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a598e150-ad70-412c-bf06-9e7bd26a8422-catalog-content\") pod \"redhat-marketplace-j9qm7\" (UID: 
\"a598e150-ad70-412c-bf06-9e7bd26a8422\") " pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.901951 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr22b\" (UniqueName: \"kubernetes.io/projected/a598e150-ad70-412c-bf06-9e7bd26a8422-kube-api-access-cr22b\") pod \"redhat-marketplace-j9qm7\" (UID: \"a598e150-ad70-412c-bf06-9e7bd26a8422\") " pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.901980 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a598e150-ad70-412c-bf06-9e7bd26a8422-utilities\") pod \"redhat-marketplace-j9qm7\" (UID: \"a598e150-ad70-412c-bf06-9e7bd26a8422\") " pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.902383 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a598e150-ad70-412c-bf06-9e7bd26a8422-utilities\") pod \"redhat-marketplace-j9qm7\" (UID: \"a598e150-ad70-412c-bf06-9e7bd26a8422\") " pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.902396 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a598e150-ad70-412c-bf06-9e7bd26a8422-catalog-content\") pod \"redhat-marketplace-j9qm7\" (UID: \"a598e150-ad70-412c-bf06-9e7bd26a8422\") " pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:10 crc kubenswrapper[4831]: I1204 10:21:10.939535 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr22b\" (UniqueName: \"kubernetes.io/projected/a598e150-ad70-412c-bf06-9e7bd26a8422-kube-api-access-cr22b\") pod \"redhat-marketplace-j9qm7\" (UID: \"a598e150-ad70-412c-bf06-9e7bd26a8422\") " 
pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:11 crc kubenswrapper[4831]: I1204 10:21:11.025070 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:11 crc kubenswrapper[4831]: I1204 10:21:11.294876 4831 generic.go:334] "Generic (PLEG): container finished" podID="910ac3eb-beda-4174-b3da-e3d708ffbcc3" containerID="4081f195d518653a427bd15d2846c74ed76c6716d0aca6928625914895af06cd" exitCode=0 Dec 04 10:21:11 crc kubenswrapper[4831]: I1204 10:21:11.294949 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95vcr" event={"ID":"910ac3eb-beda-4174-b3da-e3d708ffbcc3","Type":"ContainerDied","Data":"4081f195d518653a427bd15d2846c74ed76c6716d0aca6928625914895af06cd"} Dec 04 10:21:11 crc kubenswrapper[4831]: I1204 10:21:11.298373 4831 generic.go:334] "Generic (PLEG): container finished" podID="1d852025-4ea6-4343-813c-2411dec5469f" containerID="d93b8fbea0178457775abc09c74f9b88958c916630351a7f9c004427630c6754" exitCode=0 Dec 04 10:21:11 crc kubenswrapper[4831]: I1204 10:21:11.298413 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khfsg" event={"ID":"1d852025-4ea6-4343-813c-2411dec5469f","Type":"ContainerDied","Data":"d93b8fbea0178457775abc09c74f9b88958c916630351a7f9c004427630c6754"} Dec 04 10:21:11 crc kubenswrapper[4831]: I1204 10:21:11.302032 4831 generic.go:334] "Generic (PLEG): container finished" podID="6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1" containerID="049e6c0bbf290bbaa703a4c01b60a522c36ab43bbf354e10461b85f4348fc82d" exitCode=0 Dec 04 10:21:11 crc kubenswrapper[4831]: I1204 10:21:11.302058 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99lqd" event={"ID":"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1","Type":"ContainerDied","Data":"049e6c0bbf290bbaa703a4c01b60a522c36ab43bbf354e10461b85f4348fc82d"} Dec 04 10:21:11 crc 
kubenswrapper[4831]: I1204 10:21:11.302073 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99lqd" event={"ID":"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1","Type":"ContainerStarted","Data":"0d4550ec296df6ec401694cb31fb526e5d9ee7efe1d31ea576ecf4b1428f46fd"} Dec 04 10:21:11 crc kubenswrapper[4831]: I1204 10:21:11.482012 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9qm7"] Dec 04 10:21:11 crc kubenswrapper[4831]: I1204 10:21:11.643496 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp"] Dec 04 10:21:11 crc kubenswrapper[4831]: I1204 10:21:11.643872 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" podUID="2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" containerName="controller-manager" containerID="cri-o://29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467" gracePeriod=30 Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.059685 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.120940 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-proxy-ca-bundles\") pod \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.121003 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-config\") pod \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.121049 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-serving-cert\") pod \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.121077 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk9nw\" (UniqueName: \"kubernetes.io/projected/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-kube-api-access-mk9nw\") pod \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.121124 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-client-ca\") pod \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\" (UID: \"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46\") " Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.121819 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-client-ca" (OuterVolumeSpecName: "client-ca") pod "2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" (UID: "2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.121880 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-config" (OuterVolumeSpecName: "config") pod "2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" (UID: "2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.122109 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" (UID: "2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.128624 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" (UID: "2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.135991 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-kube-api-access-mk9nw" (OuterVolumeSpecName: "kube-api-access-mk9nw") pod "2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" (UID: "2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46"). InnerVolumeSpecName "kube-api-access-mk9nw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.222230 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk9nw\" (UniqueName: \"kubernetes.io/projected/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-kube-api-access-mk9nw\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.222530 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.222541 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.222550 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.222559 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.309145 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khfsg" event={"ID":"1d852025-4ea6-4343-813c-2411dec5469f","Type":"ContainerStarted","Data":"9996eca8cbf6d8dc210164b28f6474a7c227ce3de1923b21eed7c063a027ca32"} Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.310829 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99lqd" 
event={"ID":"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1","Type":"ContainerStarted","Data":"6330b887b6ce00be63531ec98a66f1486b5fba1407b7685a654939e64e83aaf1"} Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.311928 4831 generic.go:334] "Generic (PLEG): container finished" podID="a598e150-ad70-412c-bf06-9e7bd26a8422" containerID="f369a2496bf92532cac2f67bcf63035c9e62d5337f535b385e0e54f4036a3378" exitCode=0 Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.311978 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9qm7" event={"ID":"a598e150-ad70-412c-bf06-9e7bd26a8422","Type":"ContainerDied","Data":"f369a2496bf92532cac2f67bcf63035c9e62d5337f535b385e0e54f4036a3378"} Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.311994 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9qm7" event={"ID":"a598e150-ad70-412c-bf06-9e7bd26a8422","Type":"ContainerStarted","Data":"60c3794b02000af6ef242b1421f9a11e6c2baa4236d854e0e322564ec7ea8be9"} Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.315112 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95vcr" event={"ID":"910ac3eb-beda-4174-b3da-e3d708ffbcc3","Type":"ContainerStarted","Data":"c29b58247a5c76d25a3faa5307261e37d79877037ecd94ccc3b1dd50d5503f1f"} Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.318882 4831 generic.go:334] "Generic (PLEG): container finished" podID="2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" containerID="29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467" exitCode=0 Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.318913 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" event={"ID":"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46","Type":"ContainerDied","Data":"29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467"} Dec 04 10:21:12 crc 
kubenswrapper[4831]: I1204 10:21:12.318930 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" event={"ID":"2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46","Type":"ContainerDied","Data":"da2d3675ec6a76653f34c20e185cac885ec4610dc3a9fc1a9cc3501d0287edb7"} Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.318945 4831 scope.go:117] "RemoveContainer" containerID="29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.318949 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.329226 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-khfsg" podStartSLOduration=2.825639537 podStartE2EDuration="5.329208117s" podCreationTimestamp="2025-12-04 10:21:07 +0000 UTC" firstStartedPulling="2025-12-04 10:21:09.274254833 +0000 UTC m=+366.223430187" lastFinishedPulling="2025-12-04 10:21:11.777823453 +0000 UTC m=+368.726998767" observedRunningTime="2025-12-04 10:21:12.326980348 +0000 UTC m=+369.276155662" watchObservedRunningTime="2025-12-04 10:21:12.329208117 +0000 UTC m=+369.278383431" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.330461 4831 scope.go:117] "RemoveContainer" containerID="29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467" Dec 04 10:21:12 crc kubenswrapper[4831]: E1204 10:21:12.331754 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467\": container with ID starting with 29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467 not found: ID does not exist" containerID="29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467" Dec 04 
10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.331778 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467"} err="failed to get container status \"29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467\": rpc error: code = NotFound desc = could not find container \"29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467\": container with ID starting with 29a863cbef0df7ee329d8adc6dd4c3ff4f54ae34da49ba8210cd5f6b1fb38467 not found: ID does not exist" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.366715 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-95vcr" podStartSLOduration=1.814228172 podStartE2EDuration="4.366648663s" podCreationTimestamp="2025-12-04 10:21:08 +0000 UTC" firstStartedPulling="2025-12-04 10:21:09.269219388 +0000 UTC m=+366.218394742" lastFinishedPulling="2025-12-04 10:21:11.821639919 +0000 UTC m=+368.770815233" observedRunningTime="2025-12-04 10:21:12.364780663 +0000 UTC m=+369.313955977" watchObservedRunningTime="2025-12-04 10:21:12.366648663 +0000 UTC m=+369.315823977" Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.420117 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp"] Dec 04 10:21:12 crc kubenswrapper[4831]: I1204 10:21:12.425061 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f4c649f4c-7ndrp"] Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.223292 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-565d46959-f2mmf"] Dec 04 10:21:13 crc kubenswrapper[4831]: E1204 10:21:13.223502 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" containerName="controller-manager" Dec 04 
10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.223515 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" containerName="controller-manager" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.223616 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" containerName="controller-manager" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.223962 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.225683 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.226995 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.227370 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.227577 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.227686 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.227708 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.233267 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxkj4\" (UniqueName: 
\"kubernetes.io/projected/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-kube-api-access-pxkj4\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.233313 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-proxy-ca-bundles\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.233351 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-config\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.233434 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-serving-cert\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.233486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-client-ca\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc 
kubenswrapper[4831]: I1204 10:21:13.237701 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.242619 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-565d46959-f2mmf"] Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.281794 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46" path="/var/lib/kubelet/pods/2d6e1a3b-2c37-457a-b436-ef7a6b3bcf46/volumes" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.327403 4831 generic.go:334] "Generic (PLEG): container finished" podID="6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1" containerID="6330b887b6ce00be63531ec98a66f1486b5fba1407b7685a654939e64e83aaf1" exitCode=0 Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.327705 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99lqd" event={"ID":"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1","Type":"ContainerDied","Data":"6330b887b6ce00be63531ec98a66f1486b5fba1407b7685a654939e64e83aaf1"} Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.330438 4831 generic.go:334] "Generic (PLEG): container finished" podID="a598e150-ad70-412c-bf06-9e7bd26a8422" containerID="4d35eeabb86505c049ff304693b71ff406c4980fd84cf278e6dea0d83604e7d0" exitCode=0 Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.330756 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9qm7" event={"ID":"a598e150-ad70-412c-bf06-9e7bd26a8422","Type":"ContainerDied","Data":"4d35eeabb86505c049ff304693b71ff406c4980fd84cf278e6dea0d83604e7d0"} Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.335704 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxkj4\" (UniqueName: 
\"kubernetes.io/projected/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-kube-api-access-pxkj4\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.335773 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-proxy-ca-bundles\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.335829 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-config\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.335981 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-serving-cert\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.336079 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-client-ca\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.338253 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-proxy-ca-bundles\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.338383 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-config\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.338586 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-client-ca\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.343107 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-serving-cert\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.358604 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxkj4\" (UniqueName: \"kubernetes.io/projected/0ccb5509-f3aa-49c8-afbf-2b7b848da8be-kube-api-access-pxkj4\") pod \"controller-manager-565d46959-f2mmf\" (UID: \"0ccb5509-f3aa-49c8-afbf-2b7b848da8be\") " pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 
10:21:13.542027 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:13 crc kubenswrapper[4831]: I1204 10:21:13.747689 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-565d46959-f2mmf"] Dec 04 10:21:13 crc kubenswrapper[4831]: W1204 10:21:13.753470 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ccb5509_f3aa_49c8_afbf_2b7b848da8be.slice/crio-cdafea57981335eeee20e545ff0e1a2146639373617833cd35a988e6663f7f38 WatchSource:0}: Error finding container cdafea57981335eeee20e545ff0e1a2146639373617833cd35a988e6663f7f38: Status 404 returned error can't find the container with id cdafea57981335eeee20e545ff0e1a2146639373617833cd35a988e6663f7f38 Dec 04 10:21:14 crc kubenswrapper[4831]: I1204 10:21:14.338102 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99lqd" event={"ID":"6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1","Type":"ContainerStarted","Data":"746ebc15d856e0168593e31f433e0e8a2d9e0b2763b151dbe89d9c9929717d6d"} Dec 04 10:21:14 crc kubenswrapper[4831]: I1204 10:21:14.339303 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" event={"ID":"0ccb5509-f3aa-49c8-afbf-2b7b848da8be","Type":"ContainerStarted","Data":"10640f66e540c56a40c831a81c4b3af079d3d4d908798569fd6b4a20e6495501"} Dec 04 10:21:14 crc kubenswrapper[4831]: I1204 10:21:14.339328 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" event={"ID":"0ccb5509-f3aa-49c8-afbf-2b7b848da8be","Type":"ContainerStarted","Data":"cdafea57981335eeee20e545ff0e1a2146639373617833cd35a988e6663f7f38"} Dec 04 10:21:14 crc kubenswrapper[4831]: I1204 10:21:14.339558 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:14 crc kubenswrapper[4831]: I1204 10:21:14.341866 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9qm7" event={"ID":"a598e150-ad70-412c-bf06-9e7bd26a8422","Type":"ContainerStarted","Data":"605906e2bb5d50a56f17f71e5e4aee7b63684bf3b8c2ff8a330920dd50f20f99"} Dec 04 10:21:14 crc kubenswrapper[4831]: I1204 10:21:14.349844 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" Dec 04 10:21:14 crc kubenswrapper[4831]: I1204 10:21:14.358773 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-99lqd" podStartSLOduration=2.9144133290000003 podStartE2EDuration="5.35875382s" podCreationTimestamp="2025-12-04 10:21:09 +0000 UTC" firstStartedPulling="2025-12-04 10:21:11.303499687 +0000 UTC m=+368.252675021" lastFinishedPulling="2025-12-04 10:21:13.747840208 +0000 UTC m=+370.697015512" observedRunningTime="2025-12-04 10:21:14.356646933 +0000 UTC m=+371.305822257" watchObservedRunningTime="2025-12-04 10:21:14.35875382 +0000 UTC m=+371.307929134" Dec 04 10:21:14 crc kubenswrapper[4831]: I1204 10:21:14.378388 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j9qm7" podStartSLOduration=2.901873232 podStartE2EDuration="4.378367606s" podCreationTimestamp="2025-12-04 10:21:10 +0000 UTC" firstStartedPulling="2025-12-04 10:21:12.313353262 +0000 UTC m=+369.262528576" lastFinishedPulling="2025-12-04 10:21:13.789847636 +0000 UTC m=+370.739022950" observedRunningTime="2025-12-04 10:21:14.375356735 +0000 UTC m=+371.324532049" watchObservedRunningTime="2025-12-04 10:21:14.378367606 +0000 UTC m=+371.327542920" Dec 04 10:21:14 crc kubenswrapper[4831]: I1204 10:21:14.397370 4831 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-controller-manager/controller-manager-565d46959-f2mmf" podStartSLOduration=3.397354356 podStartE2EDuration="3.397354356s" podCreationTimestamp="2025-12-04 10:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:21:14.393378859 +0000 UTC m=+371.342554183" watchObservedRunningTime="2025-12-04 10:21:14.397354356 +0000 UTC m=+371.346529670" Dec 04 10:21:15 crc kubenswrapper[4831]: I1204 10:21:15.969881 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gh27z"] Dec 04 10:21:15 crc kubenswrapper[4831]: I1204 10:21:15.970730 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:15 crc kubenswrapper[4831]: I1204 10:21:15.991902 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gh27z"] Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.081107 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.081163 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5996767-41a7-44d5-ad67-84b175f1faa9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 
10:21:16.081198 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5996767-41a7-44d5-ad67-84b175f1faa9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.081218 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5996767-41a7-44d5-ad67-84b175f1faa9-registry-tls\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.081236 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5996767-41a7-44d5-ad67-84b175f1faa9-trusted-ca\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.081254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zn47\" (UniqueName: \"kubernetes.io/projected/c5996767-41a7-44d5-ad67-84b175f1faa9-kube-api-access-6zn47\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.081274 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5996767-41a7-44d5-ad67-84b175f1faa9-bound-sa-token\") pod 
\"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.081300 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5996767-41a7-44d5-ad67-84b175f1faa9-registry-certificates\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.102918 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.182294 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5996767-41a7-44d5-ad67-84b175f1faa9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.182360 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5996767-41a7-44d5-ad67-84b175f1faa9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.182390 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5996767-41a7-44d5-ad67-84b175f1faa9-registry-tls\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.182414 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5996767-41a7-44d5-ad67-84b175f1faa9-trusted-ca\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.182438 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zn47\" (UniqueName: \"kubernetes.io/projected/c5996767-41a7-44d5-ad67-84b175f1faa9-kube-api-access-6zn47\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.182464 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5996767-41a7-44d5-ad67-84b175f1faa9-bound-sa-token\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.182496 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5996767-41a7-44d5-ad67-84b175f1faa9-registry-certificates\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc 
kubenswrapper[4831]: I1204 10:21:16.183853 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5996767-41a7-44d5-ad67-84b175f1faa9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.184866 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5996767-41a7-44d5-ad67-84b175f1faa9-registry-certificates\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.185172 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5996767-41a7-44d5-ad67-84b175f1faa9-trusted-ca\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.188483 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5996767-41a7-44d5-ad67-84b175f1faa9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.189741 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5996767-41a7-44d5-ad67-84b175f1faa9-registry-tls\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.199929 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5996767-41a7-44d5-ad67-84b175f1faa9-bound-sa-token\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.200181 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zn47\" (UniqueName: \"kubernetes.io/projected/c5996767-41a7-44d5-ad67-84b175f1faa9-kube-api-access-6zn47\") pod \"image-registry-66df7c8f76-gh27z\" (UID: \"c5996767-41a7-44d5-ad67-84b175f1faa9\") " pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.291135 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:16 crc kubenswrapper[4831]: I1204 10:21:16.682064 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gh27z"] Dec 04 10:21:16 crc kubenswrapper[4831]: W1204 10:21:16.687254 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5996767_41a7_44d5_ad67_84b175f1faa9.slice/crio-ab9996392c23794a152d4a931104cc19ff49c04b0febddcc29483cb1473ec258 WatchSource:0}: Error finding container ab9996392c23794a152d4a931104cc19ff49c04b0febddcc29483cb1473ec258: Status 404 returned error can't find the container with id ab9996392c23794a152d4a931104cc19ff49c04b0febddcc29483cb1473ec258 Dec 04 10:21:17 crc kubenswrapper[4831]: I1204 10:21:17.358929 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" 
event={"ID":"c5996767-41a7-44d5-ad67-84b175f1faa9","Type":"ContainerStarted","Data":"ab9996392c23794a152d4a931104cc19ff49c04b0febddcc29483cb1473ec258"} Dec 04 10:21:18 crc kubenswrapper[4831]: I1204 10:21:18.235848 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:18 crc kubenswrapper[4831]: I1204 10:21:18.236196 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:18 crc kubenswrapper[4831]: I1204 10:21:18.289395 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:18 crc kubenswrapper[4831]: I1204 10:21:18.401638 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-khfsg" Dec 04 10:21:18 crc kubenswrapper[4831]: I1204 10:21:18.465739 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:18 crc kubenswrapper[4831]: I1204 10:21:18.465801 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:18 crc kubenswrapper[4831]: I1204 10:21:18.511052 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:19 crc kubenswrapper[4831]: I1204 10:21:19.370649 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" event={"ID":"c5996767-41a7-44d5-ad67-84b175f1faa9","Type":"ContainerStarted","Data":"658b85d54a7aee7058c02535775b33b5d978e30cb0f01435741dd4595da84b48"} Dec 04 10:21:19 crc kubenswrapper[4831]: I1204 10:21:19.394340 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" podStartSLOduration=4.3943101030000005 podStartE2EDuration="4.394310103s" podCreationTimestamp="2025-12-04 10:21:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:21:19.387602533 +0000 UTC m=+376.336777847" watchObservedRunningTime="2025-12-04 10:21:19.394310103 +0000 UTC m=+376.343485417" Dec 04 10:21:19 crc kubenswrapper[4831]: I1204 10:21:19.416563 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-95vcr" Dec 04 10:21:20 crc kubenswrapper[4831]: I1204 10:21:20.030262 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:20 crc kubenswrapper[4831]: I1204 10:21:20.030403 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:20 crc kubenswrapper[4831]: I1204 10:21:20.069092 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:20 crc kubenswrapper[4831]: I1204 10:21:20.376333 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:20 crc kubenswrapper[4831]: I1204 10:21:20.418020 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-99lqd" Dec 04 10:21:21 crc kubenswrapper[4831]: I1204 10:21:21.025627 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:21 crc kubenswrapper[4831]: I1204 10:21:21.025686 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:21 
crc kubenswrapper[4831]: I1204 10:21:21.066832 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:21 crc kubenswrapper[4831]: I1204 10:21:21.417616 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j9qm7" Dec 04 10:21:21 crc kubenswrapper[4831]: I1204 10:21:21.971907 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:21:21 crc kubenswrapper[4831]: I1204 10:21:21.972209 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:21:36 crc kubenswrapper[4831]: I1204 10:21:36.298971 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-gh27z" Dec 04 10:21:36 crc kubenswrapper[4831]: I1204 10:21:36.352221 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7b6hb"] Dec 04 10:21:51 crc kubenswrapper[4831]: I1204 10:21:51.971583 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:21:51 crc kubenswrapper[4831]: I1204 10:21:51.972324 4831 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.387766 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" podUID="ca0af9c8-f2fa-45f8-a428-9b061441bddf" containerName="registry" containerID="cri-o://c022d73d85974e6e516667345714b78fb0a15772cc42647110cf984c2551736d" gracePeriod=30 Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.615533 4831 generic.go:334] "Generic (PLEG): container finished" podID="ca0af9c8-f2fa-45f8-a428-9b061441bddf" containerID="c022d73d85974e6e516667345714b78fb0a15772cc42647110cf984c2551736d" exitCode=0 Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.615586 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" event={"ID":"ca0af9c8-f2fa-45f8-a428-9b061441bddf","Type":"ContainerDied","Data":"c022d73d85974e6e516667345714b78fb0a15772cc42647110cf984c2551736d"} Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.793520 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.961106 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-bound-sa-token\") pod \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.961174 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ca0af9c8-f2fa-45f8-a428-9b061441bddf-installation-pull-secrets\") pod \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.961221 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-trusted-ca\") pod \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.961239 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvshm\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-kube-api-access-wvshm\") pod \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.961290 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-tls\") pod \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.961441 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.961477 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ca0af9c8-f2fa-45f8-a428-9b061441bddf-ca-trust-extracted\") pod \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.961492 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-certificates\") pod \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\" (UID: \"ca0af9c8-f2fa-45f8-a428-9b061441bddf\") " Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.962367 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ca0af9c8-f2fa-45f8-a428-9b061441bddf" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.963507 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ca0af9c8-f2fa-45f8-a428-9b061441bddf" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.969414 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ca0af9c8-f2fa-45f8-a428-9b061441bddf" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.976320 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "ca0af9c8-f2fa-45f8-a428-9b061441bddf" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.977005 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca0af9c8-f2fa-45f8-a428-9b061441bddf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ca0af9c8-f2fa-45f8-a428-9b061441bddf" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.979291 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ca0af9c8-f2fa-45f8-a428-9b061441bddf" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.979631 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-kube-api-access-wvshm" (OuterVolumeSpecName: "kube-api-access-wvshm") pod "ca0af9c8-f2fa-45f8-a428-9b061441bddf" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf"). InnerVolumeSpecName "kube-api-access-wvshm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:22:01 crc kubenswrapper[4831]: I1204 10:22:01.980047 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0af9c8-f2fa-45f8-a428-9b061441bddf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ca0af9c8-f2fa-45f8-a428-9b061441bddf" (UID: "ca0af9c8-f2fa-45f8-a428-9b061441bddf"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.062738 4831 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ca0af9c8-f2fa-45f8-a428-9b061441bddf-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.062787 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.062808 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvshm\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-kube-api-access-wvshm\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.062820 4831 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.062832 4831 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ca0af9c8-f2fa-45f8-a428-9b061441bddf-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.062843 4831 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ca0af9c8-f2fa-45f8-a428-9b061441bddf-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.062855 4831 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca0af9c8-f2fa-45f8-a428-9b061441bddf-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.626835 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" event={"ID":"ca0af9c8-f2fa-45f8-a428-9b061441bddf","Type":"ContainerDied","Data":"342cc0e1e622c6b03b82f2a37c15b03e4981eb14c6c54ed2150bebe14ad86c37"} Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.626914 4831 scope.go:117] "RemoveContainer" containerID="c022d73d85974e6e516667345714b78fb0a15772cc42647110cf984c2551736d" Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.626917 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7b6hb" Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.678554 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7b6hb"] Dec 04 10:22:02 crc kubenswrapper[4831]: I1204 10:22:02.685096 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7b6hb"] Dec 04 10:22:03 crc kubenswrapper[4831]: I1204 10:22:03.288197 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca0af9c8-f2fa-45f8-a428-9b061441bddf" path="/var/lib/kubelet/pods/ca0af9c8-f2fa-45f8-a428-9b061441bddf/volumes" Dec 04 10:22:21 crc kubenswrapper[4831]: I1204 10:22:21.971963 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:22:21 crc kubenswrapper[4831]: I1204 10:22:21.972536 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:22:21 crc kubenswrapper[4831]: I1204 10:22:21.972638 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:22:21 crc kubenswrapper[4831]: I1204 10:22:21.973454 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f1a1ccac41bfbdc84a51f15b671049caf3ab62892dd1887a29c7fe8435d66ae"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:22:21 crc kubenswrapper[4831]: I1204 10:22:21.973560 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://6f1a1ccac41bfbdc84a51f15b671049caf3ab62892dd1887a29c7fe8435d66ae" gracePeriod=600 Dec 04 10:22:22 crc kubenswrapper[4831]: I1204 10:22:22.758969 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="6f1a1ccac41bfbdc84a51f15b671049caf3ab62892dd1887a29c7fe8435d66ae" exitCode=0 Dec 04 10:22:22 crc kubenswrapper[4831]: I1204 10:22:22.759042 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"6f1a1ccac41bfbdc84a51f15b671049caf3ab62892dd1887a29c7fe8435d66ae"} Dec 04 10:22:22 crc kubenswrapper[4831]: I1204 10:22:22.759483 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"5f944904d013e4dfa65943141ad3d8f85fcad518eb19d11e801287afe4feca7d"} Dec 04 10:22:22 crc kubenswrapper[4831]: I1204 10:22:22.759522 4831 scope.go:117] "RemoveContainer" containerID="5399a139e421589fc50e4bad256402be123e37600f5c39eadf919ce767e9eeeb" Dec 04 10:24:51 crc kubenswrapper[4831]: I1204 10:24:51.971981 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:24:51 crc kubenswrapper[4831]: I1204 
10:24:51.972943 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:25:22 crc kubenswrapper[4831]: I1204 10:25:22.122471 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:25:22 crc kubenswrapper[4831]: I1204 10:25:22.123276 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.240619 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4t49t"] Dec 04 10:25:25 crc kubenswrapper[4831]: E1204 10:25:25.241372 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0af9c8-f2fa-45f8-a428-9b061441bddf" containerName="registry" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.241420 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0af9c8-f2fa-45f8-a428-9b061441bddf" containerName="registry" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.241606 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca0af9c8-f2fa-45f8-a428-9b061441bddf" containerName="registry" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.244517 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4t49t" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.247518 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.249310 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hw7ft"] Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.249946 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-hw7ft" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.253544 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4t49t"] Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.271288 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.274812 4831 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-m8jj7" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.274917 4831 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-bhwbb" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.287842 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hw7ft"] Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.295726 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ktsxl"] Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.296541 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-ktsxl" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.299976 4831 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-rrbnz" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.323594 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ktsxl"] Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.362940 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt7nq\" (UniqueName: \"kubernetes.io/projected/b5c24c38-2e13-41db-87b8-187403880fba-kube-api-access-rt7nq\") pod \"cert-manager-5b446d88c5-4t49t\" (UID: \"b5c24c38-2e13-41db-87b8-187403880fba\") " pod="cert-manager/cert-manager-5b446d88c5-4t49t" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.362979 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flk24\" (UniqueName: \"kubernetes.io/projected/d6099ee7-1019-4da3-b3c2-6b47d3c81931-kube-api-access-flk24\") pod \"cert-manager-webhook-5655c58dd6-ktsxl\" (UID: \"d6099ee7-1019-4da3-b3c2-6b47d3c81931\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ktsxl" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.363156 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsdtz\" (UniqueName: \"kubernetes.io/projected/d66df267-588e-4675-8081-046e41097b63-kube-api-access-vsdtz\") pod \"cert-manager-cainjector-7f985d654d-hw7ft\" (UID: \"d66df267-588e-4675-8081-046e41097b63\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hw7ft" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.464121 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt7nq\" (UniqueName: 
\"kubernetes.io/projected/b5c24c38-2e13-41db-87b8-187403880fba-kube-api-access-rt7nq\") pod \"cert-manager-5b446d88c5-4t49t\" (UID: \"b5c24c38-2e13-41db-87b8-187403880fba\") " pod="cert-manager/cert-manager-5b446d88c5-4t49t" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.464178 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flk24\" (UniqueName: \"kubernetes.io/projected/d6099ee7-1019-4da3-b3c2-6b47d3c81931-kube-api-access-flk24\") pod \"cert-manager-webhook-5655c58dd6-ktsxl\" (UID: \"d6099ee7-1019-4da3-b3c2-6b47d3c81931\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ktsxl" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.464260 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsdtz\" (UniqueName: \"kubernetes.io/projected/d66df267-588e-4675-8081-046e41097b63-kube-api-access-vsdtz\") pod \"cert-manager-cainjector-7f985d654d-hw7ft\" (UID: \"d66df267-588e-4675-8081-046e41097b63\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hw7ft" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.483324 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt7nq\" (UniqueName: \"kubernetes.io/projected/b5c24c38-2e13-41db-87b8-187403880fba-kube-api-access-rt7nq\") pod \"cert-manager-5b446d88c5-4t49t\" (UID: \"b5c24c38-2e13-41db-87b8-187403880fba\") " pod="cert-manager/cert-manager-5b446d88c5-4t49t" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.484576 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsdtz\" (UniqueName: \"kubernetes.io/projected/d66df267-588e-4675-8081-046e41097b63-kube-api-access-vsdtz\") pod \"cert-manager-cainjector-7f985d654d-hw7ft\" (UID: \"d66df267-588e-4675-8081-046e41097b63\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hw7ft" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.487730 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-flk24\" (UniqueName: \"kubernetes.io/projected/d6099ee7-1019-4da3-b3c2-6b47d3c81931-kube-api-access-flk24\") pod \"cert-manager-webhook-5655c58dd6-ktsxl\" (UID: \"d6099ee7-1019-4da3-b3c2-6b47d3c81931\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-ktsxl" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.561163 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4t49t" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.567078 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-hw7ft" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.615257 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-ktsxl" Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.870144 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-ktsxl"] Dec 04 10:25:25 crc kubenswrapper[4831]: W1204 10:25:25.873988 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6099ee7_1019_4da3_b3c2_6b47d3c81931.slice/crio-3e09e971e6beb0933277e9048c147703c7d264abca87f265175db7a0529e6a8c WatchSource:0}: Error finding container 3e09e971e6beb0933277e9048c147703c7d264abca87f265175db7a0529e6a8c: Status 404 returned error can't find the container with id 3e09e971e6beb0933277e9048c147703c7d264abca87f265175db7a0529e6a8c Dec 04 10:25:25 crc kubenswrapper[4831]: I1204 10:25:25.876227 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:25:26 crc kubenswrapper[4831]: I1204 10:25:26.040121 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4t49t"] Dec 04 10:25:26 crc kubenswrapper[4831]: I1204 
10:25:26.047127 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hw7ft"] Dec 04 10:25:26 crc kubenswrapper[4831]: W1204 10:25:26.048714 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5c24c38_2e13_41db_87b8_187403880fba.slice/crio-2a10edeb316036fc1e8de93b49d35d76e7ea42704582040aa5bf1701a2666f59 WatchSource:0}: Error finding container 2a10edeb316036fc1e8de93b49d35d76e7ea42704582040aa5bf1701a2666f59: Status 404 returned error can't find the container with id 2a10edeb316036fc1e8de93b49d35d76e7ea42704582040aa5bf1701a2666f59 Dec 04 10:25:26 crc kubenswrapper[4831]: W1204 10:25:26.052796 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd66df267_588e_4675_8081_046e41097b63.slice/crio-668543ae89094e69523140ff08d1d69dfe65b953e3b93759a22a3d91dc6201f6 WatchSource:0}: Error finding container 668543ae89094e69523140ff08d1d69dfe65b953e3b93759a22a3d91dc6201f6: Status 404 returned error can't find the container with id 668543ae89094e69523140ff08d1d69dfe65b953e3b93759a22a3d91dc6201f6 Dec 04 10:25:26 crc kubenswrapper[4831]: I1204 10:25:26.309165 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-hw7ft" event={"ID":"d66df267-588e-4675-8081-046e41097b63","Type":"ContainerStarted","Data":"668543ae89094e69523140ff08d1d69dfe65b953e3b93759a22a3d91dc6201f6"} Dec 04 10:25:26 crc kubenswrapper[4831]: I1204 10:25:26.311460 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-ktsxl" event={"ID":"d6099ee7-1019-4da3-b3c2-6b47d3c81931","Type":"ContainerStarted","Data":"3e09e971e6beb0933277e9048c147703c7d264abca87f265175db7a0529e6a8c"} Dec 04 10:25:26 crc kubenswrapper[4831]: I1204 10:25:26.313140 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-5b446d88c5-4t49t" event={"ID":"b5c24c38-2e13-41db-87b8-187403880fba","Type":"ContainerStarted","Data":"2a10edeb316036fc1e8de93b49d35d76e7ea42704582040aa5bf1701a2666f59"} Dec 04 10:25:30 crc kubenswrapper[4831]: I1204 10:25:30.354703 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-hw7ft" event={"ID":"d66df267-588e-4675-8081-046e41097b63","Type":"ContainerStarted","Data":"4dcca13c1d4b458f07f4898b1d4d5cbd8550eed9f62ccd961c875868e20bd37e"} Dec 04 10:25:30 crc kubenswrapper[4831]: I1204 10:25:30.358150 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-ktsxl" event={"ID":"d6099ee7-1019-4da3-b3c2-6b47d3c81931","Type":"ContainerStarted","Data":"29e4a6d66b2a7e43eb5fdaaa660dc1bdd04733eed6e5128bcad7b925e6643c70"} Dec 04 10:25:30 crc kubenswrapper[4831]: I1204 10:25:30.358789 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-ktsxl" Dec 04 10:25:30 crc kubenswrapper[4831]: I1204 10:25:30.360920 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4t49t" event={"ID":"b5c24c38-2e13-41db-87b8-187403880fba","Type":"ContainerStarted","Data":"cfd1d9666b24506683de4f21d82663cb87bd561bd8f9b4192639dedc347b3b5b"} Dec 04 10:25:30 crc kubenswrapper[4831]: I1204 10:25:30.376677 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-hw7ft" podStartSLOduration=1.6071548679999998 podStartE2EDuration="5.376637806s" podCreationTimestamp="2025-12-04 10:25:25 +0000 UTC" firstStartedPulling="2025-12-04 10:25:26.054811059 +0000 UTC m=+623.003986373" lastFinishedPulling="2025-12-04 10:25:29.824293957 +0000 UTC m=+626.773469311" observedRunningTime="2025-12-04 10:25:30.371956411 +0000 UTC m=+627.321131725" watchObservedRunningTime="2025-12-04 10:25:30.376637806 +0000 UTC 
m=+627.325813120" Dec 04 10:25:30 crc kubenswrapper[4831]: I1204 10:25:30.398680 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-4t49t" podStartSLOduration=1.6972319599999999 podStartE2EDuration="5.398647843s" podCreationTimestamp="2025-12-04 10:25:25 +0000 UTC" firstStartedPulling="2025-12-04 10:25:26.05148022 +0000 UTC m=+623.000655534" lastFinishedPulling="2025-12-04 10:25:29.752896083 +0000 UTC m=+626.702071417" observedRunningTime="2025-12-04 10:25:30.396278939 +0000 UTC m=+627.345454253" watchObservedRunningTime="2025-12-04 10:25:30.398647843 +0000 UTC m=+627.347823157" Dec 04 10:25:30 crc kubenswrapper[4831]: I1204 10:25:30.423198 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-ktsxl" podStartSLOduration=1.546067519 podStartE2EDuration="5.423168536s" podCreationTimestamp="2025-12-04 10:25:25 +0000 UTC" firstStartedPulling="2025-12-04 10:25:25.875801826 +0000 UTC m=+622.824977150" lastFinishedPulling="2025-12-04 10:25:29.752902843 +0000 UTC m=+626.702078167" observedRunningTime="2025-12-04 10:25:30.4161574 +0000 UTC m=+627.365332724" watchObservedRunningTime="2025-12-04 10:25:30.423168536 +0000 UTC m=+627.372343870" Dec 04 10:25:35 crc kubenswrapper[4831]: I1204 10:25:35.619203 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-ktsxl" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.077884 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xzkp"] Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.078446 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovn-controller" containerID="cri-o://648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92" 
gracePeriod=30 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.078562 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="nbdb" containerID="cri-o://c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea" gracePeriod=30 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.078599 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="kube-rbac-proxy-node" containerID="cri-o://99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140" gracePeriod=30 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.078600 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3" gracePeriod=30 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.078706 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovn-acl-logging" containerID="cri-o://cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa" gracePeriod=30 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.078724 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="sbdb" containerID="cri-o://10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe" gracePeriod=30 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.078738 4831 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="northd" containerID="cri-o://ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96" gracePeriod=30 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.120771 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" containerID="cri-o://374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e" gracePeriod=30 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.378382 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/3.log" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.381830 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovn-acl-logging/0.log" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.382409 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovn-controller/0.log" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.382988 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.429347 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovnkube-controller/3.log" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.434865 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovn-acl-logging/0.log" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.435847 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xzkp_1261b9db-fe52-4fbc-9a9c-7e0c3486276e/ovn-controller/0.log" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437025 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e" exitCode=0 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437080 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe" exitCode=0 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437103 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea" exitCode=0 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437130 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96" exitCode=0 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437150 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" 
containerID="3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3" exitCode=0 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437168 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140" exitCode=0 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437188 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa" exitCode=143 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437213 4831 generic.go:334] "Generic (PLEG): container finished" podID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerID="648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92" exitCode=143 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437313 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437359 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437384 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437409 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" 
event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437429 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437456 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437505 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437534 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437553 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437569 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437584 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437601 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437616 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437628 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437640 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437697 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437722 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437739 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437751 4831 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437763 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437776 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437788 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437800 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437816 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437832 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437847 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437869 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437896 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437910 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437922 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437935 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437946 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437957 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437970 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140"} Dec 04 
10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437982 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.437993 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438004 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438020 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" event={"ID":"1261b9db-fe52-4fbc-9a9c-7e0c3486276e","Type":"ContainerDied","Data":"e9eea486c4c1a5a5149fe823358e24fb2e8b0d0101dcbffe45d9305f5b002602"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438039 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438053 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438065 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438076 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438087 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438099 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438110 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438121 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438133 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438144 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.438172 4831 scope.go:117] "RemoveContainer" containerID="374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.440005 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xzkp" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.441246 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5g27v_c6a78509-d612-4338-8562-9b0627c1793f/kube-multus/2.log" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.442131 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5g27v_c6a78509-d612-4338-8562-9b0627c1793f/kube-multus/1.log" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.442202 4831 generic.go:334] "Generic (PLEG): container finished" podID="c6a78509-d612-4338-8562-9b0627c1793f" containerID="cf6417296513bd39972c5b248d9ea179e028735ddb1b875af0424524b1d2c67d" exitCode=2 Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.442244 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5g27v" event={"ID":"c6a78509-d612-4338-8562-9b0627c1793f","Type":"ContainerDied","Data":"cf6417296513bd39972c5b248d9ea179e028735ddb1b875af0424524b1d2c67d"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.442277 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a"} Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.442977 4831 scope.go:117] "RemoveContainer" containerID="cf6417296513bd39972c5b248d9ea179e028735ddb1b875af0424524b1d2c67d" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.443844 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5g27v_openshift-multus(c6a78509-d612-4338-8562-9b0627c1793f)\"" pod="openshift-multus/multus-5g27v" podUID="c6a78509-d612-4338-8562-9b0627c1793f" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.461576 4831 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bhzlh"] Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.461910 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.461930 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.461947 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.461960 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.461978 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="sbdb" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.461992 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="sbdb" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.462019 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovn-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462032 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovn-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.462048 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="kubecfg-setup" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462060 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="kubecfg-setup" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.462077 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovn-acl-logging" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462092 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovn-acl-logging" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.462107 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="nbdb" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462120 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="nbdb" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.462135 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462149 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.462167 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="northd" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462179 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="northd" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.462202 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462215 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.462232 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462245 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.462261 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="kube-rbac-proxy-node" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462274 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="kube-rbac-proxy-node" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462442 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462461 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovn-acl-logging" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462476 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="kube-rbac-proxy-node" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462493 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462514 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovn-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462531 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="northd" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462545 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="nbdb" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462562 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="sbdb" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462582 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462596 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462609 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.462820 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.462836 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.463017 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" containerName="ovnkube-controller" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.466089 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.479429 4831 scope.go:117] "RemoveContainer" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.512903 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-script-lib\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.512951 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-etc-openvswitch\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.512975 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-config\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.512995 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-bin\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513015 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-ovn\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: 
\"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513033 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-netd\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513049 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-node-log\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513081 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb7xl\" (UniqueName: \"kubernetes.io/projected/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-kube-api-access-tb7xl\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513100 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-systemd-units\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513124 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513144 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-systemd\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513167 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-netns\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513188 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-ovn-kubernetes\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513214 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-env-overrides\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513233 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-log-socket\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513252 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-kubelet\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 
10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513275 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-openvswitch\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513294 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-slash\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513321 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-var-lib-openvswitch\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513342 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovn-node-metrics-cert\") pod \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\" (UID: \"1261b9db-fe52-4fbc-9a9c-7e0c3486276e\") " Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513426 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-run-netns\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513457 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-env-overrides\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513459 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513480 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-run-systemd\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513538 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513563 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-ovnkube-script-lib\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513607 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-run-ovn-kubernetes\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513633 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-cni-bin\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513715 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-node-log\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513766 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-log-socket\") pod \"ovnkube-node-bhzlh\" (UID: 
\"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513787 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-cni-netd\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513807 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-run-ovn\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513831 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-ovnkube-config\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513854 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-etc-openvswitch\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513877 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h65d5\" (UniqueName: \"kubernetes.io/projected/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-kube-api-access-h65d5\") pod 
\"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513900 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-run-openvswitch\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513930 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-var-lib-openvswitch\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513954 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-kubelet\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513990 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.513992 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514026 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-systemd-units\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514030 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514052 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514066 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-ovn-node-metrics-cert\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514075 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514095 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-node-log" (OuterVolumeSpecName: "node-log") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514103 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-slash\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514158 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514173 4831 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514185 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514199 4831 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514211 4831 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514222 4831 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514233 4831 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-node-log\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514590 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-log-socket" (OuterVolumeSpecName: "log-socket") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514780 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.514813 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.515217 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.515246 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.515272 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.515294 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-slash" (OuterVolumeSpecName: "host-slash") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.515319 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.515340 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.524641 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.525021 4831 scope.go:117] "RemoveContainer" containerID="10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.528903 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-kube-api-access-tb7xl" (OuterVolumeSpecName: "kube-api-access-tb7xl") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "kube-api-access-tb7xl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.528951 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.535950 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1261b9db-fe52-4fbc-9a9c-7e0c3486276e" (UID: "1261b9db-fe52-4fbc-9a9c-7e0c3486276e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.541770 4831 scope.go:117] "RemoveContainer" containerID="c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.559828 4831 scope.go:117] "RemoveContainer" containerID="ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.573851 4831 scope.go:117] "RemoveContainer" containerID="3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.592263 4831 scope.go:117] "RemoveContainer" containerID="99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.609087 4831 scope.go:117] "RemoveContainer" containerID="cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.615686 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-systemd-units\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.615746 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-ovn-node-metrics-cert\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.615813 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-systemd-units\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.615948 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-slash\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616328 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-slash\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616389 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-run-netns\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616420 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-env-overrides\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616457 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-run-systemd\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616480 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-ovnkube-script-lib\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616505 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-run-ovn-kubernetes\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616565 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-cni-bin\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616603 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-node-log\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616636 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-run-ovn\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616640 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-run-netns\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616677 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-log-socket\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616633 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-run-systemd\") pod \"ovnkube-node-bhzlh\" (UID: 
\"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616705 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-cni-bin\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616751 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-log-socket\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616701 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-cni-netd\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616731 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-cni-netd\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616795 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-node-log\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: 
I1204 10:25:36.616815 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-run-ovn\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616820 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-ovnkube-config\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616867 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-run-ovn-kubernetes\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.616983 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-etc-openvswitch\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617009 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-run-openvswitch\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617030 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-h65d5\" (UniqueName: \"kubernetes.io/projected/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-kube-api-access-h65d5\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617055 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-var-lib-openvswitch\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617074 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-kubelet\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617095 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-env-overrides\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617101 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617127 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617151 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-etc-openvswitch\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617165 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-host-kubelet\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617175 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-var-lib-openvswitch\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617201 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-run-openvswitch\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617673 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb7xl\" (UniqueName: 
\"kubernetes.io/projected/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-kube-api-access-tb7xl\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617691 4831 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617703 4831 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617716 4831 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617763 4831 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617778 4831 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617789 4831 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617799 4831 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-log-socket\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617810 4831 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617810 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-ovnkube-script-lib\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617820 4831 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617884 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-ovnkube-config\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617891 4831 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-host-slash\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.617938 4831 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 
10:25:36.617955 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1261b9db-fe52-4fbc-9a9c-7e0c3486276e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.618851 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-ovn-node-metrics-cert\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.624649 4831 scope.go:117] "RemoveContainer" containerID="648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.637791 4831 scope.go:117] "RemoveContainer" containerID="dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.640628 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h65d5\" (UniqueName: \"kubernetes.io/projected/c4c30bb6-9cd8-464c-934c-d0feb55a1fda-kube-api-access-h65d5\") pod \"ovnkube-node-bhzlh\" (UID: \"c4c30bb6-9cd8-464c-934c-d0feb55a1fda\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.658315 4831 scope.go:117] "RemoveContainer" containerID="374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.658921 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e\": container with ID starting with 374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e not found: ID does not exist" 
containerID="374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.658961 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e"} err="failed to get container status \"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e\": rpc error: code = NotFound desc = could not find container \"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e\": container with ID starting with 374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.658987 4831 scope.go:117] "RemoveContainer" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.660167 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\": container with ID starting with e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382 not found: ID does not exist" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.660240 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382"} err="failed to get container status \"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\": rpc error: code = NotFound desc = could not find container \"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\": container with ID starting with e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.660280 4831 scope.go:117] 
"RemoveContainer" containerID="10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.660741 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\": container with ID starting with 10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe not found: ID does not exist" containerID="10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.660785 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe"} err="failed to get container status \"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\": rpc error: code = NotFound desc = could not find container \"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\": container with ID starting with 10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.660812 4831 scope.go:117] "RemoveContainer" containerID="c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.661196 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\": container with ID starting with c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea not found: ID does not exist" containerID="c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.661250 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea"} err="failed to get container status \"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\": rpc error: code = NotFound desc = could not find container \"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\": container with ID starting with c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.661279 4831 scope.go:117] "RemoveContainer" containerID="ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.661816 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\": container with ID starting with ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96 not found: ID does not exist" containerID="ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.661847 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96"} err="failed to get container status \"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\": rpc error: code = NotFound desc = could not find container \"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\": container with ID starting with ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.661868 4831 scope.go:117] "RemoveContainer" containerID="3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.662191 4831 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\": container with ID starting with 3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3 not found: ID does not exist" containerID="3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.662220 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3"} err="failed to get container status \"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\": rpc error: code = NotFound desc = could not find container \"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\": container with ID starting with 3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.662235 4831 scope.go:117] "RemoveContainer" containerID="99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.662649 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\": container with ID starting with 99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140 not found: ID does not exist" containerID="99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.662696 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140"} err="failed to get container status \"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\": rpc error: code = NotFound desc = could not find container 
\"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\": container with ID starting with 99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.662714 4831 scope.go:117] "RemoveContainer" containerID="cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.663153 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\": container with ID starting with cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa not found: ID does not exist" containerID="cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.663251 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa"} err="failed to get container status \"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\": rpc error: code = NotFound desc = could not find container \"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\": container with ID starting with cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.663307 4831 scope.go:117] "RemoveContainer" containerID="648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.663760 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\": container with ID starting with 648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92 not found: ID does not exist" 
containerID="648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.663793 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92"} err="failed to get container status \"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\": rpc error: code = NotFound desc = could not find container \"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\": container with ID starting with 648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.663816 4831 scope.go:117] "RemoveContainer" containerID="dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f" Dec 04 10:25:36 crc kubenswrapper[4831]: E1204 10:25:36.664141 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\": container with ID starting with dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f not found: ID does not exist" containerID="dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.664196 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f"} err="failed to get container status \"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\": rpc error: code = NotFound desc = could not find container \"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\": container with ID starting with dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.664225 4831 scope.go:117] 
"RemoveContainer" containerID="374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.664648 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e"} err="failed to get container status \"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e\": rpc error: code = NotFound desc = could not find container \"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e\": container with ID starting with 374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.664692 4831 scope.go:117] "RemoveContainer" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.664975 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382"} err="failed to get container status \"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\": rpc error: code = NotFound desc = could not find container \"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\": container with ID starting with e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.665016 4831 scope.go:117] "RemoveContainer" containerID="10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.665452 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe"} err="failed to get container status \"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\": rpc error: code = 
NotFound desc = could not find container \"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\": container with ID starting with 10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.665488 4831 scope.go:117] "RemoveContainer" containerID="c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.665865 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea"} err="failed to get container status \"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\": rpc error: code = NotFound desc = could not find container \"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\": container with ID starting with c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.665885 4831 scope.go:117] "RemoveContainer" containerID="ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.666201 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96"} err="failed to get container status \"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\": rpc error: code = NotFound desc = could not find container \"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\": container with ID starting with ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.666233 4831 scope.go:117] "RemoveContainer" containerID="3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3" Dec 04 10:25:36 crc 
kubenswrapper[4831]: I1204 10:25:36.666543 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3"} err="failed to get container status \"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\": rpc error: code = NotFound desc = could not find container \"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\": container with ID starting with 3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.666610 4831 scope.go:117] "RemoveContainer" containerID="99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.667130 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140"} err="failed to get container status \"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\": rpc error: code = NotFound desc = could not find container \"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\": container with ID starting with 99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.667151 4831 scope.go:117] "RemoveContainer" containerID="cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.667439 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa"} err="failed to get container status \"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\": rpc error: code = NotFound desc = could not find container \"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\": container 
with ID starting with cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.667458 4831 scope.go:117] "RemoveContainer" containerID="648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.667744 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92"} err="failed to get container status \"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\": rpc error: code = NotFound desc = could not find container \"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\": container with ID starting with 648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.667762 4831 scope.go:117] "RemoveContainer" containerID="dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.668090 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f"} err="failed to get container status \"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\": rpc error: code = NotFound desc = could not find container \"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\": container with ID starting with dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.668109 4831 scope.go:117] "RemoveContainer" containerID="374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.668386 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e"} err="failed to get container status \"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e\": rpc error: code = NotFound desc = could not find container \"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e\": container with ID starting with 374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.668406 4831 scope.go:117] "RemoveContainer" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.668727 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382"} err="failed to get container status \"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\": rpc error: code = NotFound desc = could not find container \"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\": container with ID starting with e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.668796 4831 scope.go:117] "RemoveContainer" containerID="10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.669367 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe"} err="failed to get container status \"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\": rpc error: code = NotFound desc = could not find container \"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\": container with ID starting with 10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe not found: ID does not 
exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.669395 4831 scope.go:117] "RemoveContainer" containerID="c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.669755 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea"} err="failed to get container status \"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\": rpc error: code = NotFound desc = could not find container \"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\": container with ID starting with c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.669792 4831 scope.go:117] "RemoveContainer" containerID="ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.670214 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96"} err="failed to get container status \"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\": rpc error: code = NotFound desc = could not find container \"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\": container with ID starting with ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.670268 4831 scope.go:117] "RemoveContainer" containerID="3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.670731 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3"} err="failed to get container status 
\"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\": rpc error: code = NotFound desc = could not find container \"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\": container with ID starting with 3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.670756 4831 scope.go:117] "RemoveContainer" containerID="99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.670994 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140"} err="failed to get container status \"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\": rpc error: code = NotFound desc = could not find container \"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\": container with ID starting with 99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.671089 4831 scope.go:117] "RemoveContainer" containerID="cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.671527 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa"} err="failed to get container status \"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\": rpc error: code = NotFound desc = could not find container \"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\": container with ID starting with cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.671548 4831 scope.go:117] "RemoveContainer" 
containerID="648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.671863 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92"} err="failed to get container status \"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\": rpc error: code = NotFound desc = could not find container \"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\": container with ID starting with 648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.671906 4831 scope.go:117] "RemoveContainer" containerID="dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.672250 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f"} err="failed to get container status \"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\": rpc error: code = NotFound desc = could not find container \"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\": container with ID starting with dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.672268 4831 scope.go:117] "RemoveContainer" containerID="374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.672576 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e"} err="failed to get container status \"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e\": rpc error: code = NotFound desc = could 
not find container \"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e\": container with ID starting with 374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.672615 4831 scope.go:117] "RemoveContainer" containerID="e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.673021 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382"} err="failed to get container status \"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\": rpc error: code = NotFound desc = could not find container \"e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382\": container with ID starting with e913d104de697629b6461dc48f3d2e2f336dce5f39f5d9a976f749dca87a2382 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.673039 4831 scope.go:117] "RemoveContainer" containerID="10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.673460 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe"} err="failed to get container status \"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\": rpc error: code = NotFound desc = could not find container \"10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe\": container with ID starting with 10bddd31a1d8b40b72ee665b216db29bc4c3be7189e123d3f65a61657419b5fe not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.673493 4831 scope.go:117] "RemoveContainer" containerID="c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 
10:25:36.674046 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea"} err="failed to get container status \"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\": rpc error: code = NotFound desc = could not find container \"c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea\": container with ID starting with c6aad03160309dbcb14976e320a82ef41bb5df30923a761193feedce7e7432ea not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.674088 4831 scope.go:117] "RemoveContainer" containerID="ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.674515 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96"} err="failed to get container status \"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\": rpc error: code = NotFound desc = could not find container \"ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96\": container with ID starting with ff4ed0cdab15e4b732c4da632ec84c11fb47d2d92405b0cce8e6898419049d96 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.674541 4831 scope.go:117] "RemoveContainer" containerID="3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.674958 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3"} err="failed to get container status \"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\": rpc error: code = NotFound desc = could not find container \"3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3\": container with ID starting with 
3e7f8124cf7b16168a4472ec5c9c761b1cfbcda346928239e0728f91f53cafc3 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.674977 4831 scope.go:117] "RemoveContainer" containerID="99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.675240 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140"} err="failed to get container status \"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\": rpc error: code = NotFound desc = could not find container \"99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140\": container with ID starting with 99bdb24487a452734969faad228beacf6bfa9306c8bbd8323add111bd5d75140 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.675292 4831 scope.go:117] "RemoveContainer" containerID="cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.675863 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa"} err="failed to get container status \"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\": rpc error: code = NotFound desc = could not find container \"cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa\": container with ID starting with cd60d53c13947941c93b7ab98c2375b567653dcd9f1cc0639a5582cae3ca68aa not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.675883 4831 scope.go:117] "RemoveContainer" containerID="648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.676196 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92"} err="failed to get container status \"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\": rpc error: code = NotFound desc = could not find container \"648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92\": container with ID starting with 648a27bdbfbf11f72cfb168340ea7e20f224d179a85b54074982da1f77d3ac92 not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.676214 4831 scope.go:117] "RemoveContainer" containerID="dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.676490 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f"} err="failed to get container status \"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\": rpc error: code = NotFound desc = could not find container \"dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f\": container with ID starting with dd4da91614a1ff61a3846b5f07b0e9cfb81f201b2c2cdb70453f59d7f5b7731f not found: ID does not exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.676562 4831 scope.go:117] "RemoveContainer" containerID="374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.677038 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e"} err="failed to get container status \"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e\": rpc error: code = NotFound desc = could not find container \"374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e\": container with ID starting with 374249a0590c2fc6e217a4254dd8a1d33c409fd5486c32eddcd4f1dc30bd9f7e not found: ID does not 
exist" Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.789461 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xzkp"] Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.794512 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xzkp"] Dec 04 10:25:36 crc kubenswrapper[4831]: I1204 10:25:36.801290 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:37 crc kubenswrapper[4831]: I1204 10:25:37.283391 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1261b9db-fe52-4fbc-9a9c-7e0c3486276e" path="/var/lib/kubelet/pods/1261b9db-fe52-4fbc-9a9c-7e0c3486276e/volumes" Dec 04 10:25:37 crc kubenswrapper[4831]: I1204 10:25:37.450542 4831 generic.go:334] "Generic (PLEG): container finished" podID="c4c30bb6-9cd8-464c-934c-d0feb55a1fda" containerID="375a03ba9e630177ec7e53ef16e162f3cfba39ddb0ccef32830193356c486168" exitCode=0 Dec 04 10:25:37 crc kubenswrapper[4831]: I1204 10:25:37.450638 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" event={"ID":"c4c30bb6-9cd8-464c-934c-d0feb55a1fda","Type":"ContainerDied","Data":"375a03ba9e630177ec7e53ef16e162f3cfba39ddb0ccef32830193356c486168"} Dec 04 10:25:37 crc kubenswrapper[4831]: I1204 10:25:37.450689 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" event={"ID":"c4c30bb6-9cd8-464c-934c-d0feb55a1fda","Type":"ContainerStarted","Data":"e20d7e9cf733c333f3d82c2b7aa5af40da32beb91462651c6942126e04ec66d1"} Dec 04 10:25:38 crc kubenswrapper[4831]: I1204 10:25:38.463304 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" event={"ID":"c4c30bb6-9cd8-464c-934c-d0feb55a1fda","Type":"ContainerStarted","Data":"36de8f4d7358ab7070974afb85858b9459cd39ed5b6f079376fabf1a36e92723"} 
Dec 04 10:25:38 crc kubenswrapper[4831]: I1204 10:25:38.463699 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" event={"ID":"c4c30bb6-9cd8-464c-934c-d0feb55a1fda","Type":"ContainerStarted","Data":"f1e8a58f21546292d540dedcc56221cff54b224ca74c80f706c62eb24b06a155"} Dec 04 10:25:38 crc kubenswrapper[4831]: I1204 10:25:38.463725 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" event={"ID":"c4c30bb6-9cd8-464c-934c-d0feb55a1fda","Type":"ContainerStarted","Data":"9ee2aba245f6f7a495d36bc4cf9f1f8f2b2774a1e3546bc28e394ff9055223a7"} Dec 04 10:25:38 crc kubenswrapper[4831]: I1204 10:25:38.463741 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" event={"ID":"c4c30bb6-9cd8-464c-934c-d0feb55a1fda","Type":"ContainerStarted","Data":"e2fc0706076bbaf7d136b05abc9a264f63a47dbadcb4a877ada02239273e593d"} Dec 04 10:25:38 crc kubenswrapper[4831]: I1204 10:25:38.463755 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" event={"ID":"c4c30bb6-9cd8-464c-934c-d0feb55a1fda","Type":"ContainerStarted","Data":"b0a2b28591b79513ae6f286601f0dd66757b6cce090ac24252cb736f6e982ab9"} Dec 04 10:25:39 crc kubenswrapper[4831]: I1204 10:25:39.475545 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" event={"ID":"c4c30bb6-9cd8-464c-934c-d0feb55a1fda","Type":"ContainerStarted","Data":"bd6adbf38baad9470e96ceafe91a3e4960bf2ba797bd6acbf64d5f2ebaa5cca4"} Dec 04 10:25:41 crc kubenswrapper[4831]: I1204 10:25:41.489882 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" event={"ID":"c4c30bb6-9cd8-464c-934c-d0feb55a1fda","Type":"ContainerStarted","Data":"a4659d6210d6b1d00d1b4cacffda0ad5ba1768db6a2f9ce1f95d3c47a9aabf72"} Dec 04 10:25:43 crc kubenswrapper[4831]: I1204 10:25:43.502095 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" event={"ID":"c4c30bb6-9cd8-464c-934c-d0feb55a1fda","Type":"ContainerStarted","Data":"2e011956a350b94d685dfd11d8c50eec62e21e1cd03c77058b14d3ff5adf7b4d"} Dec 04 10:25:43 crc kubenswrapper[4831]: I1204 10:25:43.502512 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:43 crc kubenswrapper[4831]: I1204 10:25:43.502546 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:43 crc kubenswrapper[4831]: I1204 10:25:43.502554 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:43 crc kubenswrapper[4831]: I1204 10:25:43.538258 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:43 crc kubenswrapper[4831]: I1204 10:25:43.539300 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:25:43 crc kubenswrapper[4831]: I1204 10:25:43.576766 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" podStartSLOduration=7.57674599 podStartE2EDuration="7.57674599s" podCreationTimestamp="2025-12-04 10:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:25:43.537637287 +0000 UTC m=+640.486812611" watchObservedRunningTime="2025-12-04 10:25:43.57674599 +0000 UTC m=+640.525921304" Dec 04 10:25:48 crc kubenswrapper[4831]: I1204 10:25:48.276744 4831 scope.go:117] "RemoveContainer" containerID="cf6417296513bd39972c5b248d9ea179e028735ddb1b875af0424524b1d2c67d" Dec 04 10:25:48 crc kubenswrapper[4831]: E1204 10:25:48.277630 4831 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5g27v_openshift-multus(c6a78509-d612-4338-8562-9b0627c1793f)\"" pod="openshift-multus/multus-5g27v" podUID="c6a78509-d612-4338-8562-9b0627c1793f" Dec 04 10:25:51 crc kubenswrapper[4831]: I1204 10:25:51.972906 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:25:51 crc kubenswrapper[4831]: I1204 10:25:51.973432 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:25:51 crc kubenswrapper[4831]: I1204 10:25:51.973538 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:25:51 crc kubenswrapper[4831]: I1204 10:25:51.974506 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f944904d013e4dfa65943141ad3d8f85fcad518eb19d11e801287afe4feca7d"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:25:51 crc kubenswrapper[4831]: I1204 10:25:51.974594 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" 
containerName="machine-config-daemon" containerID="cri-o://5f944904d013e4dfa65943141ad3d8f85fcad518eb19d11e801287afe4feca7d" gracePeriod=600 Dec 04 10:25:53 crc kubenswrapper[4831]: I1204 10:25:53.566564 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="5f944904d013e4dfa65943141ad3d8f85fcad518eb19d11e801287afe4feca7d" exitCode=0 Dec 04 10:25:53 crc kubenswrapper[4831]: I1204 10:25:53.566625 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"5f944904d013e4dfa65943141ad3d8f85fcad518eb19d11e801287afe4feca7d"} Dec 04 10:25:53 crc kubenswrapper[4831]: I1204 10:25:53.567014 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"3282642ca73576095ec0c4e4a39a50ef22334c6e41917a860b9796381706e28c"} Dec 04 10:25:53 crc kubenswrapper[4831]: I1204 10:25:53.567041 4831 scope.go:117] "RemoveContainer" containerID="6f1a1ccac41bfbdc84a51f15b671049caf3ab62892dd1887a29c7fe8435d66ae" Dec 04 10:26:01 crc kubenswrapper[4831]: I1204 10:26:01.276595 4831 scope.go:117] "RemoveContainer" containerID="cf6417296513bd39972c5b248d9ea179e028735ddb1b875af0424524b1d2c67d" Dec 04 10:26:01 crc kubenswrapper[4831]: I1204 10:26:01.620436 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5g27v_c6a78509-d612-4338-8562-9b0627c1793f/kube-multus/2.log" Dec 04 10:26:01 crc kubenswrapper[4831]: I1204 10:26:01.621527 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5g27v_c6a78509-d612-4338-8562-9b0627c1793f/kube-multus/1.log" Dec 04 10:26:01 crc kubenswrapper[4831]: I1204 10:26:01.621605 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-5g27v" event={"ID":"c6a78509-d612-4338-8562-9b0627c1793f","Type":"ContainerStarted","Data":"718625da62107e05b8eb99eef11512818ca2bb40643313a20d133171c40cb3a2"} Dec 04 10:26:03 crc kubenswrapper[4831]: I1204 10:26:03.515769 4831 scope.go:117] "RemoveContainer" containerID="f4bd1b329fe50dcd62c415378ed6a27d47c10396502c612f82a95540cd56501a" Dec 04 10:26:03 crc kubenswrapper[4831]: I1204 10:26:03.634814 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5g27v_c6a78509-d612-4338-8562-9b0627c1793f/kube-multus/2.log" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.453470 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4"] Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.455069 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.458009 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.463064 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4"] Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.605279 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.605415 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.605481 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chzwn\" (UniqueName: \"kubernetes.io/projected/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-kube-api-access-chzwn\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.706214 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.706345 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.706413 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chzwn\" (UniqueName: 
\"kubernetes.io/projected/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-kube-api-access-chzwn\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.707232 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.707290 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.731284 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chzwn\" (UniqueName: \"kubernetes.io/projected/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-kube-api-access-chzwn\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:05 crc kubenswrapper[4831]: I1204 10:26:05.773833 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:06 crc kubenswrapper[4831]: W1204 10:26:06.042574 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda85d95a9_36ba_4a69_a690_6e2f3b5421c6.slice/crio-c67883473c18be9c84bb82ec06161a26290e77893475a7c2c03916721982b97c WatchSource:0}: Error finding container c67883473c18be9c84bb82ec06161a26290e77893475a7c2c03916721982b97c: Status 404 returned error can't find the container with id c67883473c18be9c84bb82ec06161a26290e77893475a7c2c03916721982b97c Dec 04 10:26:06 crc kubenswrapper[4831]: I1204 10:26:06.042884 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4"] Dec 04 10:26:06 crc kubenswrapper[4831]: I1204 10:26:06.651621 4831 generic.go:334] "Generic (PLEG): container finished" podID="a85d95a9-36ba-4a69-a690-6e2f3b5421c6" containerID="bc79665236ea0f241c23bf20923dea6346356f544fa69c15cd2b64c8dd14b450" exitCode=0 Dec 04 10:26:06 crc kubenswrapper[4831]: I1204 10:26:06.651737 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" event={"ID":"a85d95a9-36ba-4a69-a690-6e2f3b5421c6","Type":"ContainerDied","Data":"bc79665236ea0f241c23bf20923dea6346356f544fa69c15cd2b64c8dd14b450"} Dec 04 10:26:06 crc kubenswrapper[4831]: I1204 10:26:06.652771 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" event={"ID":"a85d95a9-36ba-4a69-a690-6e2f3b5421c6","Type":"ContainerStarted","Data":"c67883473c18be9c84bb82ec06161a26290e77893475a7c2c03916721982b97c"} Dec 04 10:26:06 crc kubenswrapper[4831]: I1204 10:26:06.842338 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-bhzlh" Dec 04 10:26:10 crc kubenswrapper[4831]: I1204 10:26:10.679320 4831 generic.go:334] "Generic (PLEG): container finished" podID="a85d95a9-36ba-4a69-a690-6e2f3b5421c6" containerID="21df4d1832ebc4555d2e451e8da6ea73a5585feead4cc3b4c1670728e03f8f5f" exitCode=0 Dec 04 10:26:10 crc kubenswrapper[4831]: I1204 10:26:10.679447 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" event={"ID":"a85d95a9-36ba-4a69-a690-6e2f3b5421c6","Type":"ContainerDied","Data":"21df4d1832ebc4555d2e451e8da6ea73a5585feead4cc3b4c1670728e03f8f5f"} Dec 04 10:26:11 crc kubenswrapper[4831]: I1204 10:26:11.687462 4831 generic.go:334] "Generic (PLEG): container finished" podID="a85d95a9-36ba-4a69-a690-6e2f3b5421c6" containerID="57ca9c6a1c4c055d0a4a915748f91a9ddfd62553d6e9b8e4c2e4eb4357b1b22a" exitCode=0 Dec 04 10:26:11 crc kubenswrapper[4831]: I1204 10:26:11.687568 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" event={"ID":"a85d95a9-36ba-4a69-a690-6e2f3b5421c6","Type":"ContainerDied","Data":"57ca9c6a1c4c055d0a4a915748f91a9ddfd62553d6e9b8e4c2e4eb4357b1b22a"} Dec 04 10:26:12 crc kubenswrapper[4831]: I1204 10:26:12.978452 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.108866 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-bundle\") pod \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.108941 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-util\") pod \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.108991 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chzwn\" (UniqueName: \"kubernetes.io/projected/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-kube-api-access-chzwn\") pod \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\" (UID: \"a85d95a9-36ba-4a69-a690-6e2f3b5421c6\") " Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.111720 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-bundle" (OuterVolumeSpecName: "bundle") pod "a85d95a9-36ba-4a69-a690-6e2f3b5421c6" (UID: "a85d95a9-36ba-4a69-a690-6e2f3b5421c6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.116190 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-kube-api-access-chzwn" (OuterVolumeSpecName: "kube-api-access-chzwn") pod "a85d95a9-36ba-4a69-a690-6e2f3b5421c6" (UID: "a85d95a9-36ba-4a69-a690-6e2f3b5421c6"). InnerVolumeSpecName "kube-api-access-chzwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.124520 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-util" (OuterVolumeSpecName: "util") pod "a85d95a9-36ba-4a69-a690-6e2f3b5421c6" (UID: "a85d95a9-36ba-4a69-a690-6e2f3b5421c6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.210431 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.210467 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-util\") on node \"crc\" DevicePath \"\"" Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.210481 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chzwn\" (UniqueName: \"kubernetes.io/projected/a85d95a9-36ba-4a69-a690-6e2f3b5421c6-kube-api-access-chzwn\") on node \"crc\" DevicePath \"\"" Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.702573 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" event={"ID":"a85d95a9-36ba-4a69-a690-6e2f3b5421c6","Type":"ContainerDied","Data":"c67883473c18be9c84bb82ec06161a26290e77893475a7c2c03916721982b97c"} Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.702860 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c67883473c18be9c84bb82ec06161a26290e77893475a7c2c03916721982b97c" Dec 04 10:26:13 crc kubenswrapper[4831]: I1204 10:26:13.702706 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4" Dec 04 10:26:23 crc kubenswrapper[4831]: I1204 10:26:23.935640 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-tmjrz"] Dec 04 10:26:23 crc kubenswrapper[4831]: E1204 10:26:23.936339 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85d95a9-36ba-4a69-a690-6e2f3b5421c6" containerName="pull" Dec 04 10:26:23 crc kubenswrapper[4831]: I1204 10:26:23.936351 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85d95a9-36ba-4a69-a690-6e2f3b5421c6" containerName="pull" Dec 04 10:26:23 crc kubenswrapper[4831]: E1204 10:26:23.936373 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85d95a9-36ba-4a69-a690-6e2f3b5421c6" containerName="extract" Dec 04 10:26:23 crc kubenswrapper[4831]: I1204 10:26:23.936380 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85d95a9-36ba-4a69-a690-6e2f3b5421c6" containerName="extract" Dec 04 10:26:23 crc kubenswrapper[4831]: E1204 10:26:23.936389 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85d95a9-36ba-4a69-a690-6e2f3b5421c6" containerName="util" Dec 04 10:26:23 crc kubenswrapper[4831]: I1204 10:26:23.936394 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85d95a9-36ba-4a69-a690-6e2f3b5421c6" containerName="util" Dec 04 10:26:23 crc kubenswrapper[4831]: I1204 10:26:23.936477 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85d95a9-36ba-4a69-a690-6e2f3b5421c6" containerName="extract" Dec 04 10:26:23 crc kubenswrapper[4831]: I1204 10:26:23.936847 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-tmjrz" Dec 04 10:26:23 crc kubenswrapper[4831]: I1204 10:26:23.940648 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 04 10:26:23 crc kubenswrapper[4831]: I1204 10:26:23.940803 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 04 10:26:23 crc kubenswrapper[4831]: I1204 10:26:23.944481 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-x9k5p" Dec 04 10:26:23 crc kubenswrapper[4831]: I1204 10:26:23.956547 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-tmjrz"] Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.048239 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqms9\" (UniqueName: \"kubernetes.io/projected/6aea0d1c-6c0f-4d5a-a071-c1c597eea91c-kube-api-access-xqms9\") pod \"obo-prometheus-operator-668cf9dfbb-tmjrz\" (UID: \"6aea0d1c-6c0f-4d5a-a071-c1c597eea91c\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-tmjrz" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.051466 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw"] Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.052127 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.055771 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-ldffw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.056713 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.067217 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn"] Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.069062 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.070279 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw"] Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.082555 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn"] Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.149081 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqms9\" (UniqueName: \"kubernetes.io/projected/6aea0d1c-6c0f-4d5a-a071-c1c597eea91c-kube-api-access-xqms9\") pod \"obo-prometheus-operator-668cf9dfbb-tmjrz\" (UID: \"6aea0d1c-6c0f-4d5a-a071-c1c597eea91c\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-tmjrz" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.149136 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/cd429fe0-b539-4209-a587-a9534c8fcc74-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw\" (UID: \"cd429fe0-b539-4209-a587-a9534c8fcc74\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.149221 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cd429fe0-b539-4209-a587-a9534c8fcc74-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw\" (UID: \"cd429fe0-b539-4209-a587-a9534c8fcc74\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.168586 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-j8hhw"] Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.169305 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.170488 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqms9\" (UniqueName: \"kubernetes.io/projected/6aea0d1c-6c0f-4d5a-a071-c1c597eea91c-kube-api-access-xqms9\") pod \"obo-prometheus-operator-668cf9dfbb-tmjrz\" (UID: \"6aea0d1c-6c0f-4d5a-a071-c1c597eea91c\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-tmjrz" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.171450 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-78dsb" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.171974 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.179575 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-j8hhw"] Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.250485 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/392c5061-c622-463b-b71c-961e2495e965-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn\" (UID: \"392c5061-c622-463b-b71c-961e2495e965\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.250535 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/392c5061-c622-463b-b71c-961e2495e965-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn\" (UID: \"392c5061-c622-463b-b71c-961e2495e965\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.250559 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cd429fe0-b539-4209-a587-a9534c8fcc74-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw\" (UID: \"cd429fe0-b539-4209-a587-a9534c8fcc74\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.250772 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cd429fe0-b539-4209-a587-a9534c8fcc74-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw\" (UID: \"cd429fe0-b539-4209-a587-a9534c8fcc74\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.253496 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cd429fe0-b539-4209-a587-a9534c8fcc74-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw\" (UID: \"cd429fe0-b539-4209-a587-a9534c8fcc74\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.253501 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cd429fe0-b539-4209-a587-a9534c8fcc74-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw\" (UID: \"cd429fe0-b539-4209-a587-a9534c8fcc74\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.258585 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-tmjrz" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.351039 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-b4ljp"] Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.354093 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-b4ljp" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.354979 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/392c5061-c622-463b-b71c-961e2495e965-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn\" (UID: \"392c5061-c622-463b-b71c-961e2495e965\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.355187 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/392c5061-c622-463b-b71c-961e2495e965-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn\" (UID: \"392c5061-c622-463b-b71c-961e2495e965\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.355500 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flv44\" (UniqueName: \"kubernetes.io/projected/a0aeeeda-2835-40be-bc77-78056739952f-kube-api-access-flv44\") pod \"observability-operator-d8bb48f5d-j8hhw\" (UID: \"a0aeeeda-2835-40be-bc77-78056739952f\") " pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.355625 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0aeeeda-2835-40be-bc77-78056739952f-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-j8hhw\" (UID: \"a0aeeeda-2835-40be-bc77-78056739952f\") " pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.369196 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-bkxdp" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.375699 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.396524 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/392c5061-c622-463b-b71c-961e2495e965-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn\" (UID: \"392c5061-c622-463b-b71c-961e2495e965\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.397409 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/392c5061-c622-463b-b71c-961e2495e965-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn\" (UID: \"392c5061-c622-463b-b71c-961e2495e965\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.402510 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-b4ljp"] Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.458192 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0aeeeda-2835-40be-bc77-78056739952f-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-j8hhw\" (UID: \"a0aeeeda-2835-40be-bc77-78056739952f\") " pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.458551 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr4r5\" (UniqueName: \"kubernetes.io/projected/515a4768-685f-45a1-b7e7-0b0087ba126e-kube-api-access-fr4r5\") pod \"perses-operator-5446b9c989-b4ljp\" (UID: \"515a4768-685f-45a1-b7e7-0b0087ba126e\") " pod="openshift-operators/perses-operator-5446b9c989-b4ljp" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.458589 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/515a4768-685f-45a1-b7e7-0b0087ba126e-openshift-service-ca\") pod \"perses-operator-5446b9c989-b4ljp\" (UID: \"515a4768-685f-45a1-b7e7-0b0087ba126e\") " pod="openshift-operators/perses-operator-5446b9c989-b4ljp" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.458633 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flv44\" (UniqueName: \"kubernetes.io/projected/a0aeeeda-2835-40be-bc77-78056739952f-kube-api-access-flv44\") pod \"observability-operator-d8bb48f5d-j8hhw\" (UID: \"a0aeeeda-2835-40be-bc77-78056739952f\") " pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.466419 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0aeeeda-2835-40be-bc77-78056739952f-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-j8hhw\" (UID: \"a0aeeeda-2835-40be-bc77-78056739952f\") " pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" Dec 
04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.481961 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flv44\" (UniqueName: \"kubernetes.io/projected/a0aeeeda-2835-40be-bc77-78056739952f-kube-api-access-flv44\") pod \"observability-operator-d8bb48f5d-j8hhw\" (UID: \"a0aeeeda-2835-40be-bc77-78056739952f\") " pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.507080 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-tmjrz"] Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.507105 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.560421 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr4r5\" (UniqueName: \"kubernetes.io/projected/515a4768-685f-45a1-b7e7-0b0087ba126e-kube-api-access-fr4r5\") pod \"perses-operator-5446b9c989-b4ljp\" (UID: \"515a4768-685f-45a1-b7e7-0b0087ba126e\") " pod="openshift-operators/perses-operator-5446b9c989-b4ljp" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.560468 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/515a4768-685f-45a1-b7e7-0b0087ba126e-openshift-service-ca\") pod \"perses-operator-5446b9c989-b4ljp\" (UID: \"515a4768-685f-45a1-b7e7-0b0087ba126e\") " pod="openshift-operators/perses-operator-5446b9c989-b4ljp" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.561403 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/515a4768-685f-45a1-b7e7-0b0087ba126e-openshift-service-ca\") pod \"perses-operator-5446b9c989-b4ljp\" (UID: 
\"515a4768-685f-45a1-b7e7-0b0087ba126e\") " pod="openshift-operators/perses-operator-5446b9c989-b4ljp" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.579203 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr4r5\" (UniqueName: \"kubernetes.io/projected/515a4768-685f-45a1-b7e7-0b0087ba126e-kube-api-access-fr4r5\") pod \"perses-operator-5446b9c989-b4ljp\" (UID: \"515a4768-685f-45a1-b7e7-0b0087ba126e\") " pod="openshift-operators/perses-operator-5446b9c989-b4ljp" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.656681 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw"] Dec 04 10:26:24 crc kubenswrapper[4831]: W1204 10:26:24.663766 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd429fe0_b539_4209_a587_a9534c8fcc74.slice/crio-dd575bf376289e0402b96e7c1a99726cba908a9f277b577192fdbe8ad0846ab2 WatchSource:0}: Error finding container dd575bf376289e0402b96e7c1a99726cba908a9f277b577192fdbe8ad0846ab2: Status 404 returned error can't find the container with id dd575bf376289e0402b96e7c1a99726cba908a9f277b577192fdbe8ad0846ab2 Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.683379 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.739908 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-b4ljp" Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.779546 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw" event={"ID":"cd429fe0-b539-4209-a587-a9534c8fcc74","Type":"ContainerStarted","Data":"dd575bf376289e0402b96e7c1a99726cba908a9f277b577192fdbe8ad0846ab2"} Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.780487 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-tmjrz" event={"ID":"6aea0d1c-6c0f-4d5a-a071-c1c597eea91c","Type":"ContainerStarted","Data":"0b274876dbdcbfe12d137ac46924ee7b8a8b69862f8cd119bdf8fdeab25b8ebf"} Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.816809 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-j8hhw"] Dec 04 10:26:24 crc kubenswrapper[4831]: W1204 10:26:24.845169 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0aeeeda_2835_40be_bc77_78056739952f.slice/crio-6b697a3d23252522127a6db1de413c8ae3b4f63bdaf33c35e4e62028da0b122a WatchSource:0}: Error finding container 6b697a3d23252522127a6db1de413c8ae3b4f63bdaf33c35e4e62028da0b122a: Status 404 returned error can't find the container with id 6b697a3d23252522127a6db1de413c8ae3b4f63bdaf33c35e4e62028da0b122a Dec 04 10:26:24 crc kubenswrapper[4831]: I1204 10:26:24.988423 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn"] Dec 04 10:26:25 crc kubenswrapper[4831]: W1204 10:26:25.003433 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod392c5061_c622_463b_b71c_961e2495e965.slice/crio-462cc17c5c21296a4a550271bbd11c452b9a7d75bfc456eb455547422958cbce WatchSource:0}: Error finding container 462cc17c5c21296a4a550271bbd11c452b9a7d75bfc456eb455547422958cbce: Status 404 returned error can't find the container with id 462cc17c5c21296a4a550271bbd11c452b9a7d75bfc456eb455547422958cbce Dec 04 10:26:25 crc kubenswrapper[4831]: I1204 10:26:25.078192 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-b4ljp"] Dec 04 10:26:25 crc kubenswrapper[4831]: I1204 10:26:25.786205 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-b4ljp" event={"ID":"515a4768-685f-45a1-b7e7-0b0087ba126e","Type":"ContainerStarted","Data":"d1944792069e68b067af0e265e965b5916ed36c5aab5c3cefb653801a69d6f61"} Dec 04 10:26:25 crc kubenswrapper[4831]: I1204 10:26:25.787370 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" event={"ID":"a0aeeeda-2835-40be-bc77-78056739952f","Type":"ContainerStarted","Data":"6b697a3d23252522127a6db1de413c8ae3b4f63bdaf33c35e4e62028da0b122a"} Dec 04 10:26:25 crc kubenswrapper[4831]: I1204 10:26:25.788731 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn" event={"ID":"392c5061-c622-463b-b71c-961e2495e965","Type":"ContainerStarted","Data":"462cc17c5c21296a4a550271bbd11c452b9a7d75bfc456eb455547422958cbce"} Dec 04 10:26:40 crc kubenswrapper[4831]: E1204 10:26:40.920333 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb" Dec 04 10:26:40 crc kubenswrapper[4831]: E1204 10:26:40.921192 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) 
--openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:e718854a7d6ca8accf0fa72db0eb902e46c44d747ad51dc3f06bba0cefaa3c01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:d972f4faa5e9c121402d23ed85002f26af48ec36b1b71a7489d677b3913d08b4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:91531137fc1dcd740e277e0f65e120a0176a16f788c14c27925b61aa0b792ade,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:897e1bfad1187062725b54d87107bd0155972257a50d8335dd29e1999b828a4f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:95fe5b5746ca8c07ac9217ce2d8ac8e6afad17af210f9d8e0074df1310b209a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:e9d9a89e4d8126a62b1852055482258ee528cac6398dd5d43ebad7
5ace0f33c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:ec684a0645ceb917b019af7ddba68c3533416e356ab0d0320a30e75ca7ebb31b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:3b9693fcde9b3a9494fb04735b1f7cfd0426f10be820fdc3f024175c0d3df1c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:580606f194180accc8abba099e17a26dca7522ec6d233fa2fdd40312771703e3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:e03777be39e71701935059cd877603874a13ac94daa73219d4e5e545599d78a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:aa47256193cfd2877853878e1ae97d2ab8b8e5deae62b387cbfad02b284d379c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:c595ff56b2cb85514bf4784db6ddb82e4e657e3e708a7fb695fc4997379a94d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:45a4ec2a519bcec99e886aa91596d5356a2414a2bd103baaef9fa7838c672eb2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flv44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-d8bb48f5d-j8hhw_openshift-operators(a0aeeeda-2835-40be-bc77-78056739952f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 10:26:40 crc kubenswrapper[4831]: E1204 10:26:40.922475 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" podUID="a0aeeeda-2835-40be-bc77-78056739952f" Dec 04 10:26:41 crc kubenswrapper[4831]: I1204 10:26:41.890797 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-b4ljp" event={"ID":"515a4768-685f-45a1-b7e7-0b0087ba126e","Type":"ContainerStarted","Data":"d3f3af34eade64ad40c67ebdaca00c7ba88ead7cc7c2c87291dc089494bdf187"} Dec 04 10:26:41 crc kubenswrapper[4831]: I1204 10:26:41.891146 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-b4ljp" Dec 04 10:26:41 crc kubenswrapper[4831]: I1204 10:26:41.892178 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn" event={"ID":"392c5061-c622-463b-b71c-961e2495e965","Type":"ContainerStarted","Data":"0fc092ef8174ff3bd0220bf728bfd6cb14787790243a66947470f6edfdb0fa1d"} Dec 04 10:26:41 crc kubenswrapper[4831]: I1204 10:26:41.893418 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw" event={"ID":"cd429fe0-b539-4209-a587-a9534c8fcc74","Type":"ContainerStarted","Data":"0639cb2a983087a2caa04d470336c07d398a0d7cf0628155c20f22704afe283d"} Dec 04 10:26:41 crc kubenswrapper[4831]: I1204 10:26:41.894674 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-tmjrz" event={"ID":"6aea0d1c-6c0f-4d5a-a071-c1c597eea91c","Type":"ContainerStarted","Data":"7392015f8d94f08810cf2842321503f3653fe1d02ed6cb88ffa6c87675fff053"} Dec 04 10:26:41 crc kubenswrapper[4831]: E1204 10:26:41.898161 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb\\\"\"" pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" podUID="a0aeeeda-2835-40be-bc77-78056739952f" Dec 04 10:26:41 crc kubenswrapper[4831]: I1204 10:26:41.918410 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-b4ljp" podStartSLOduration=1.9980594109999998 podStartE2EDuration="17.918393853s" podCreationTimestamp="2025-12-04 10:26:24 +0000 UTC" firstStartedPulling="2025-12-04 10:26:25.076832003 +0000 UTC m=+682.026007317" lastFinishedPulling="2025-12-04 10:26:40.997166445 +0000 UTC m=+697.946341759" observedRunningTime="2025-12-04 10:26:41.913128963 +0000 UTC m=+698.862304277" watchObservedRunningTime="2025-12-04 10:26:41.918393853 +0000 UTC m=+698.867569167" Dec 04 10:26:41 crc kubenswrapper[4831]: I1204 10:26:41.940430 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-tmjrz" podStartSLOduration=2.461215926 podStartE2EDuration="18.94040637s" podCreationTimestamp="2025-12-04 10:26:23 +0000 UTC" firstStartedPulling="2025-12-04 10:26:24.542920886 +0000 UTC m=+681.492096200" lastFinishedPulling="2025-12-04 10:26:41.02211133 +0000 UTC m=+697.971286644" observedRunningTime="2025-12-04 10:26:41.934991726 +0000 UTC m=+698.884167040" watchObservedRunningTime="2025-12-04 10:26:41.94040637 +0000 UTC m=+698.889581704" Dec 04 10:26:41 crc kubenswrapper[4831]: I1204 10:26:41.999021 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn" podStartSLOduration=2.038241032 podStartE2EDuration="17.999000821s" podCreationTimestamp="2025-12-04 10:26:24 +0000 UTC" 
firstStartedPulling="2025-12-04 10:26:25.011860412 +0000 UTC m=+681.961035726" lastFinishedPulling="2025-12-04 10:26:40.972620201 +0000 UTC m=+697.921795515" observedRunningTime="2025-12-04 10:26:41.995419826 +0000 UTC m=+698.944595150" watchObservedRunningTime="2025-12-04 10:26:41.999000821 +0000 UTC m=+698.948176135" Dec 04 10:26:54 crc kubenswrapper[4831]: I1204 10:26:54.744121 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-b4ljp" Dec 04 10:26:54 crc kubenswrapper[4831]: I1204 10:26:54.767563 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw" podStartSLOduration=14.439211414 podStartE2EDuration="30.767545078s" podCreationTimestamp="2025-12-04 10:26:24 +0000 UTC" firstStartedPulling="2025-12-04 10:26:24.6661827 +0000 UTC m=+681.615358014" lastFinishedPulling="2025-12-04 10:26:40.994516364 +0000 UTC m=+697.943691678" observedRunningTime="2025-12-04 10:26:42.016275892 +0000 UTC m=+698.965451206" watchObservedRunningTime="2025-12-04 10:26:54.767545078 +0000 UTC m=+711.716720392" Dec 04 10:26:57 crc kubenswrapper[4831]: I1204 10:26:57.996138 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" event={"ID":"a0aeeeda-2835-40be-bc77-78056739952f","Type":"ContainerStarted","Data":"5a318b077915d0e5d719c581aaf8affbdf4c02ef5b27287044f82191a43046c5"} Dec 04 10:26:57 crc kubenswrapper[4831]: I1204 10:26:57.996993 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" Dec 04 10:26:57 crc kubenswrapper[4831]: I1204 10:26:57.999273 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" Dec 04 10:26:58 crc kubenswrapper[4831]: I1204 10:26:58.020970 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-j8hhw" podStartSLOduration=1.7836352469999999 podStartE2EDuration="34.020948412s" podCreationTimestamp="2025-12-04 10:26:24 +0000 UTC" firstStartedPulling="2025-12-04 10:26:24.849383222 +0000 UTC m=+681.798558536" lastFinishedPulling="2025-12-04 10:26:57.086696387 +0000 UTC m=+714.035871701" observedRunningTime="2025-12-04 10:26:58.018906407 +0000 UTC m=+714.968081741" watchObservedRunningTime="2025-12-04 10:26:58.020948412 +0000 UTC m=+714.970123736" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.299154 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2"] Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.300709 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.303477 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.306332 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2"] Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.447561 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqkxg\" (UniqueName: \"kubernetes.io/projected/72368101-d91a-4a33-b551-667f279020c6-kube-api-access-fqkxg\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.447838 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.448017 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.549101 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqkxg\" (UniqueName: \"kubernetes.io/projected/72368101-d91a-4a33-b551-667f279020c6-kube-api-access-fqkxg\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.549180 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.549219 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.549675 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.549791 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.571398 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqkxg\" (UniqueName: \"kubernetes.io/projected/72368101-d91a-4a33-b551-667f279020c6-kube-api-access-fqkxg\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:15 crc kubenswrapper[4831]: I1204 10:27:15.613446 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:16 crc kubenswrapper[4831]: I1204 10:27:16.072524 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2"] Dec 04 10:27:16 crc kubenswrapper[4831]: W1204 10:27:16.079142 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72368101_d91a_4a33_b551_667f279020c6.slice/crio-40c4ccb0bbce83ad79d50ce7fc9d6e1f4217e603f85070622ce00df16cc4620f WatchSource:0}: Error finding container 40c4ccb0bbce83ad79d50ce7fc9d6e1f4217e603f85070622ce00df16cc4620f: Status 404 returned error can't find the container with id 40c4ccb0bbce83ad79d50ce7fc9d6e1f4217e603f85070622ce00df16cc4620f Dec 04 10:27:16 crc kubenswrapper[4831]: I1204 10:27:16.109850 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" event={"ID":"72368101-d91a-4a33-b551-667f279020c6","Type":"ContainerStarted","Data":"40c4ccb0bbce83ad79d50ce7fc9d6e1f4217e603f85070622ce00df16cc4620f"} Dec 04 10:27:18 crc kubenswrapper[4831]: I1204 10:27:18.126909 4831 generic.go:334] "Generic (PLEG): container finished" podID="72368101-d91a-4a33-b551-667f279020c6" containerID="d8f033ea4e2dfbed7aae3abcf484b9f667c8e525c997a905bb1fe4c6d9979a28" exitCode=0 Dec 04 10:27:18 crc kubenswrapper[4831]: I1204 10:27:18.126996 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" event={"ID":"72368101-d91a-4a33-b551-667f279020c6","Type":"ContainerDied","Data":"d8f033ea4e2dfbed7aae3abcf484b9f667c8e525c997a905bb1fe4c6d9979a28"} Dec 04 10:27:20 crc kubenswrapper[4831]: I1204 10:27:20.146020 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="72368101-d91a-4a33-b551-667f279020c6" containerID="3cf16ccc060618b90fad7b1ba300147e9ee1b131319e34af4bde29a5a5363033" exitCode=0 Dec 04 10:27:20 crc kubenswrapper[4831]: I1204 10:27:20.146085 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" event={"ID":"72368101-d91a-4a33-b551-667f279020c6","Type":"ContainerDied","Data":"3cf16ccc060618b90fad7b1ba300147e9ee1b131319e34af4bde29a5a5363033"} Dec 04 10:27:21 crc kubenswrapper[4831]: I1204 10:27:21.154047 4831 generic.go:334] "Generic (PLEG): container finished" podID="72368101-d91a-4a33-b551-667f279020c6" containerID="168d2cd4f3b5899027e3dcc824f829faeb6866259640403ccf358d5b8d061b7a" exitCode=0 Dec 04 10:27:21 crc kubenswrapper[4831]: I1204 10:27:21.154089 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" event={"ID":"72368101-d91a-4a33-b551-667f279020c6","Type":"ContainerDied","Data":"168d2cd4f3b5899027e3dcc824f829faeb6866259640403ccf358d5b8d061b7a"} Dec 04 10:27:22 crc kubenswrapper[4831]: I1204 10:27:22.387569 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:22 crc kubenswrapper[4831]: I1204 10:27:22.454235 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-util\") pod \"72368101-d91a-4a33-b551-667f279020c6\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " Dec 04 10:27:22 crc kubenswrapper[4831]: I1204 10:27:22.454302 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqkxg\" (UniqueName: \"kubernetes.io/projected/72368101-d91a-4a33-b551-667f279020c6-kube-api-access-fqkxg\") pod \"72368101-d91a-4a33-b551-667f279020c6\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " Dec 04 10:27:22 crc kubenswrapper[4831]: I1204 10:27:22.454343 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-bundle\") pod \"72368101-d91a-4a33-b551-667f279020c6\" (UID: \"72368101-d91a-4a33-b551-667f279020c6\") " Dec 04 10:27:22 crc kubenswrapper[4831]: I1204 10:27:22.455150 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-bundle" (OuterVolumeSpecName: "bundle") pod "72368101-d91a-4a33-b551-667f279020c6" (UID: "72368101-d91a-4a33-b551-667f279020c6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:27:22 crc kubenswrapper[4831]: I1204 10:27:22.461216 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72368101-d91a-4a33-b551-667f279020c6-kube-api-access-fqkxg" (OuterVolumeSpecName: "kube-api-access-fqkxg") pod "72368101-d91a-4a33-b551-667f279020c6" (UID: "72368101-d91a-4a33-b551-667f279020c6"). InnerVolumeSpecName "kube-api-access-fqkxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:27:22 crc kubenswrapper[4831]: I1204 10:27:22.477537 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-util" (OuterVolumeSpecName: "util") pod "72368101-d91a-4a33-b551-667f279020c6" (UID: "72368101-d91a-4a33-b551-667f279020c6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:27:22 crc kubenswrapper[4831]: I1204 10:27:22.555922 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqkxg\" (UniqueName: \"kubernetes.io/projected/72368101-d91a-4a33-b551-667f279020c6-kube-api-access-fqkxg\") on node \"crc\" DevicePath \"\"" Dec 04 10:27:22 crc kubenswrapper[4831]: I1204 10:27:22.555967 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:27:22 crc kubenswrapper[4831]: I1204 10:27:22.555985 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72368101-d91a-4a33-b551-667f279020c6-util\") on node \"crc\" DevicePath \"\"" Dec 04 10:27:23 crc kubenswrapper[4831]: I1204 10:27:23.171945 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" event={"ID":"72368101-d91a-4a33-b551-667f279020c6","Type":"ContainerDied","Data":"40c4ccb0bbce83ad79d50ce7fc9d6e1f4217e603f85070622ce00df16cc4620f"} Dec 04 10:27:23 crc kubenswrapper[4831]: I1204 10:27:23.172034 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40c4ccb0bbce83ad79d50ce7fc9d6e1f4217e603f85070622ce00df16cc4620f" Dec 04 10:27:23 crc kubenswrapper[4831]: I1204 10:27:23.172173 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2" Dec 04 10:27:26 crc kubenswrapper[4831]: I1204 10:27:26.914044 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5h7gk"] Dec 04 10:27:26 crc kubenswrapper[4831]: E1204 10:27:26.914703 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72368101-d91a-4a33-b551-667f279020c6" containerName="util" Dec 04 10:27:26 crc kubenswrapper[4831]: I1204 10:27:26.914723 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="72368101-d91a-4a33-b551-667f279020c6" containerName="util" Dec 04 10:27:26 crc kubenswrapper[4831]: E1204 10:27:26.914744 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72368101-d91a-4a33-b551-667f279020c6" containerName="pull" Dec 04 10:27:26 crc kubenswrapper[4831]: I1204 10:27:26.914755 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="72368101-d91a-4a33-b551-667f279020c6" containerName="pull" Dec 04 10:27:26 crc kubenswrapper[4831]: E1204 10:27:26.914789 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72368101-d91a-4a33-b551-667f279020c6" containerName="extract" Dec 04 10:27:26 crc kubenswrapper[4831]: I1204 10:27:26.914800 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="72368101-d91a-4a33-b551-667f279020c6" containerName="extract" Dec 04 10:27:26 crc kubenswrapper[4831]: I1204 10:27:26.914973 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="72368101-d91a-4a33-b551-667f279020c6" containerName="extract" Dec 04 10:27:26 crc kubenswrapper[4831]: I1204 10:27:26.915618 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5h7gk" Dec 04 10:27:26 crc kubenswrapper[4831]: I1204 10:27:26.917276 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 04 10:27:26 crc kubenswrapper[4831]: I1204 10:27:26.918051 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 04 10:27:26 crc kubenswrapper[4831]: I1204 10:27:26.918574 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-djsh5" Dec 04 10:27:26 crc kubenswrapper[4831]: I1204 10:27:26.928175 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5h7gk"] Dec 04 10:27:27 crc kubenswrapper[4831]: I1204 10:27:27.017595 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5xmg\" (UniqueName: \"kubernetes.io/projected/28920e6c-c06d-4dd1-8cb4-5cc192c2e8a7-kube-api-access-d5xmg\") pod \"nmstate-operator-5b5b58f5c8-5h7gk\" (UID: \"28920e6c-c06d-4dd1-8cb4-5cc192c2e8a7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5h7gk" Dec 04 10:27:27 crc kubenswrapper[4831]: I1204 10:27:27.119192 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5xmg\" (UniqueName: \"kubernetes.io/projected/28920e6c-c06d-4dd1-8cb4-5cc192c2e8a7-kube-api-access-d5xmg\") pod \"nmstate-operator-5b5b58f5c8-5h7gk\" (UID: \"28920e6c-c06d-4dd1-8cb4-5cc192c2e8a7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5h7gk" Dec 04 10:27:27 crc kubenswrapper[4831]: I1204 10:27:27.137900 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5xmg\" (UniqueName: \"kubernetes.io/projected/28920e6c-c06d-4dd1-8cb4-5cc192c2e8a7-kube-api-access-d5xmg\") pod \"nmstate-operator-5b5b58f5c8-5h7gk\" (UID: 
\"28920e6c-c06d-4dd1-8cb4-5cc192c2e8a7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5h7gk" Dec 04 10:27:27 crc kubenswrapper[4831]: I1204 10:27:27.232321 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5h7gk" Dec 04 10:27:27 crc kubenswrapper[4831]: I1204 10:27:27.705054 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5h7gk"] Dec 04 10:27:28 crc kubenswrapper[4831]: I1204 10:27:28.211817 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5h7gk" event={"ID":"28920e6c-c06d-4dd1-8cb4-5cc192c2e8a7","Type":"ContainerStarted","Data":"4678db75dbc703d479eaec304dabb9c592c4e913abe5a89e5350da76ec9661b9"} Dec 04 10:27:31 crc kubenswrapper[4831]: I1204 10:27:31.228248 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5h7gk" event={"ID":"28920e6c-c06d-4dd1-8cb4-5cc192c2e8a7","Type":"ContainerStarted","Data":"de4f57506a7c4009f62260e22b9bf9c14cdc908e515056140e800ff11feff0f3"} Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.467559 4831 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.667054 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5h7gk" podStartSLOduration=8.033753725 podStartE2EDuration="10.667035915s" podCreationTimestamp="2025-12-04 10:27:26 +0000 UTC" firstStartedPulling="2025-12-04 10:27:27.717604796 +0000 UTC m=+744.666780110" lastFinishedPulling="2025-12-04 10:27:30.350886986 +0000 UTC m=+747.300062300" observedRunningTime="2025-12-04 10:27:31.248920586 +0000 UTC m=+748.198095900" watchObservedRunningTime="2025-12-04 10:27:36.667035915 +0000 UTC m=+753.616211239" Dec 04 10:27:36 crc 
kubenswrapper[4831]: I1204 10:27:36.669096 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv"] Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.670157 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.671814 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-kn7dm" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.684821 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv"] Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.688766 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t"] Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.689511 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.694062 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.710345 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-xxbff"] Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.711024 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.723352 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t"] Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.761781 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/db769c39-2e84-4cf2-b604-73ca1c18c017-nmstate-lock\") pod \"nmstate-handler-xxbff\" (UID: \"db769c39-2e84-4cf2-b604-73ca1c18c017\") " pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.761848 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/db769c39-2e84-4cf2-b604-73ca1c18c017-dbus-socket\") pod \"nmstate-handler-xxbff\" (UID: \"db769c39-2e84-4cf2-b604-73ca1c18c017\") " pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.761871 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/db769c39-2e84-4cf2-b604-73ca1c18c017-ovs-socket\") pod \"nmstate-handler-xxbff\" (UID: \"db769c39-2e84-4cf2-b604-73ca1c18c017\") " pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.761891 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/61015d8b-cce7-496b-abc6-3b8728072665-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-xgl8t\" (UID: \"61015d8b-cce7-496b-abc6-3b8728072665\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.761921 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-nknpq\" (UniqueName: \"kubernetes.io/projected/7792787f-3a02-47e6-b818-8875b8c5b1d7-kube-api-access-nknpq\") pod \"nmstate-metrics-7f946cbc9-5xmlv\" (UID: \"7792787f-3a02-47e6-b818-8875b8c5b1d7\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.761943 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djw4c\" (UniqueName: \"kubernetes.io/projected/db769c39-2e84-4cf2-b604-73ca1c18c017-kube-api-access-djw4c\") pod \"nmstate-handler-xxbff\" (UID: \"db769c39-2e84-4cf2-b604-73ca1c18c017\") " pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.761960 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfhlf\" (UniqueName: \"kubernetes.io/projected/61015d8b-cce7-496b-abc6-3b8728072665-kube-api-access-tfhlf\") pod \"nmstate-webhook-5f6d4c5ccb-xgl8t\" (UID: \"61015d8b-cce7-496b-abc6-3b8728072665\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.816880 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x"] Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.817497 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.820875 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.821039 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.821159 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-clf44" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.835938 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x"] Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.863254 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djw4c\" (UniqueName: \"kubernetes.io/projected/db769c39-2e84-4cf2-b604-73ca1c18c017-kube-api-access-djw4c\") pod \"nmstate-handler-xxbff\" (UID: \"db769c39-2e84-4cf2-b604-73ca1c18c017\") " pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.863521 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfhlf\" (UniqueName: \"kubernetes.io/projected/61015d8b-cce7-496b-abc6-3b8728072665-kube-api-access-tfhlf\") pod \"nmstate-webhook-5f6d4c5ccb-xgl8t\" (UID: \"61015d8b-cce7-496b-abc6-3b8728072665\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.863713 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/662bab54-215c-4d55-89c1-9ae40f088ae8-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pqz7x\" (UID: \"662bab54-215c-4d55-89c1-9ae40f088ae8\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.863750 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtvxh\" (UniqueName: \"kubernetes.io/projected/662bab54-215c-4d55-89c1-9ae40f088ae8-kube-api-access-jtvxh\") pod \"nmstate-console-plugin-7fbb5f6569-pqz7x\" (UID: \"662bab54-215c-4d55-89c1-9ae40f088ae8\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.863775 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/db769c39-2e84-4cf2-b604-73ca1c18c017-nmstate-lock\") pod \"nmstate-handler-xxbff\" (UID: \"db769c39-2e84-4cf2-b604-73ca1c18c017\") " pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.863846 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/662bab54-215c-4d55-89c1-9ae40f088ae8-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pqz7x\" (UID: \"662bab54-215c-4d55-89c1-9ae40f088ae8\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.863890 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/db769c39-2e84-4cf2-b604-73ca1c18c017-dbus-socket\") pod \"nmstate-handler-xxbff\" (UID: \"db769c39-2e84-4cf2-b604-73ca1c18c017\") " pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.863926 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/db769c39-2e84-4cf2-b604-73ca1c18c017-ovs-socket\") pod \"nmstate-handler-xxbff\" (UID: 
\"db769c39-2e84-4cf2-b604-73ca1c18c017\") " pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.863954 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/61015d8b-cce7-496b-abc6-3b8728072665-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-xgl8t\" (UID: \"61015d8b-cce7-496b-abc6-3b8728072665\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.864010 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nknpq\" (UniqueName: \"kubernetes.io/projected/7792787f-3a02-47e6-b818-8875b8c5b1d7-kube-api-access-nknpq\") pod \"nmstate-metrics-7f946cbc9-5xmlv\" (UID: \"7792787f-3a02-47e6-b818-8875b8c5b1d7\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.864200 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/db769c39-2e84-4cf2-b604-73ca1c18c017-nmstate-lock\") pod \"nmstate-handler-xxbff\" (UID: \"db769c39-2e84-4cf2-b604-73ca1c18c017\") " pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.864458 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/db769c39-2e84-4cf2-b604-73ca1c18c017-dbus-socket\") pod \"nmstate-handler-xxbff\" (UID: \"db769c39-2e84-4cf2-b604-73ca1c18c017\") " pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.864487 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/db769c39-2e84-4cf2-b604-73ca1c18c017-ovs-socket\") pod \"nmstate-handler-xxbff\" (UID: \"db769c39-2e84-4cf2-b604-73ca1c18c017\") " 
pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: E1204 10:27:36.864538 4831 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 04 10:27:36 crc kubenswrapper[4831]: E1204 10:27:36.864574 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61015d8b-cce7-496b-abc6-3b8728072665-tls-key-pair podName:61015d8b-cce7-496b-abc6-3b8728072665 nodeName:}" failed. No retries permitted until 2025-12-04 10:27:37.364559158 +0000 UTC m=+754.313734472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/61015d8b-cce7-496b-abc6-3b8728072665-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-xgl8t" (UID: "61015d8b-cce7-496b-abc6-3b8728072665") : secret "openshift-nmstate-webhook" not found Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.882449 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nknpq\" (UniqueName: \"kubernetes.io/projected/7792787f-3a02-47e6-b818-8875b8c5b1d7-kube-api-access-nknpq\") pod \"nmstate-metrics-7f946cbc9-5xmlv\" (UID: \"7792787f-3a02-47e6-b818-8875b8c5b1d7\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.883265 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfhlf\" (UniqueName: \"kubernetes.io/projected/61015d8b-cce7-496b-abc6-3b8728072665-kube-api-access-tfhlf\") pod \"nmstate-webhook-5f6d4c5ccb-xgl8t\" (UID: \"61015d8b-cce7-496b-abc6-3b8728072665\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.889219 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djw4c\" (UniqueName: \"kubernetes.io/projected/db769c39-2e84-4cf2-b604-73ca1c18c017-kube-api-access-djw4c\") pod 
\"nmstate-handler-xxbff\" (UID: \"db769c39-2e84-4cf2-b604-73ca1c18c017\") " pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.964877 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/662bab54-215c-4d55-89c1-9ae40f088ae8-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pqz7x\" (UID: \"662bab54-215c-4d55-89c1-9ae40f088ae8\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.964981 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/662bab54-215c-4d55-89c1-9ae40f088ae8-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pqz7x\" (UID: \"662bab54-215c-4d55-89c1-9ae40f088ae8\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.965001 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtvxh\" (UniqueName: \"kubernetes.io/projected/662bab54-215c-4d55-89c1-9ae40f088ae8-kube-api-access-jtvxh\") pod \"nmstate-console-plugin-7fbb5f6569-pqz7x\" (UID: \"662bab54-215c-4d55-89c1-9ae40f088ae8\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:36 crc kubenswrapper[4831]: E1204 10:27:36.965099 4831 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 04 10:27:36 crc kubenswrapper[4831]: E1204 10:27:36.965175 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/662bab54-215c-4d55-89c1-9ae40f088ae8-plugin-serving-cert podName:662bab54-215c-4d55-89c1-9ae40f088ae8 nodeName:}" failed. No retries permitted until 2025-12-04 10:27:37.465159159 +0000 UTC m=+754.414334473 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/662bab54-215c-4d55-89c1-9ae40f088ae8-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-pqz7x" (UID: "662bab54-215c-4d55-89c1-9ae40f088ae8") : secret "plugin-serving-cert" not found Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.965858 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/662bab54-215c-4d55-89c1-9ae40f088ae8-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pqz7x\" (UID: \"662bab54-215c-4d55-89c1-9ae40f088ae8\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.983373 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtvxh\" (UniqueName: \"kubernetes.io/projected/662bab54-215c-4d55-89c1-9ae40f088ae8-kube-api-access-jtvxh\") pod \"nmstate-console-plugin-7fbb5f6569-pqz7x\" (UID: \"662bab54-215c-4d55-89c1-9ae40f088ae8\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:36 crc kubenswrapper[4831]: I1204 10:27:36.987869 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.034070 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.145225 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-547798d967-44879"] Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.146156 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.153092 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-547798d967-44879"] Dec 04 10:27:37 crc kubenswrapper[4831]: W1204 10:27:37.200528 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7792787f_3a02_47e6_b818_8875b8c5b1d7.slice/crio-d475aff891d0a5d7277c17fad3c8512bc046b5741906639e59537b323f8d1aeb WatchSource:0}: Error finding container d475aff891d0a5d7277c17fad3c8512bc046b5741906639e59537b323f8d1aeb: Status 404 returned error can't find the container with id d475aff891d0a5d7277c17fad3c8512bc046b5741906639e59537b323f8d1aeb Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.204152 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv"] Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.261998 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xxbff" event={"ID":"db769c39-2e84-4cf2-b604-73ca1c18c017","Type":"ContainerStarted","Data":"ffd365448f406645dae71fbf268bc4848721a5876852884e41ba6e981d2d84e5"} Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.263502 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv" event={"ID":"7792787f-3a02-47e6-b818-8875b8c5b1d7","Type":"ContainerStarted","Data":"d475aff891d0a5d7277c17fad3c8512bc046b5741906639e59537b323f8d1aeb"} Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.273135 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq7pq\" (UniqueName: \"kubernetes.io/projected/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-kube-api-access-mq7pq\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " 
pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.273173 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-oauth-serving-cert\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.273192 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-trusted-ca-bundle\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.273218 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-service-ca\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.273239 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-console-config\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.273257 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-console-serving-cert\") pod 
\"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.273293 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-console-oauth-config\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.374685 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/61015d8b-cce7-496b-abc6-3b8728072665-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-xgl8t\" (UID: \"61015d8b-cce7-496b-abc6-3b8728072665\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.374780 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq7pq\" (UniqueName: \"kubernetes.io/projected/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-kube-api-access-mq7pq\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.374805 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-oauth-serving-cert\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.374831 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-trusted-ca-bundle\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.374864 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-service-ca\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.374890 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-console-config\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.374924 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-console-serving-cert\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.374958 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-console-oauth-config\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.375970 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-oauth-serving-cert\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.377145 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-trusted-ca-bundle\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.377571 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-console-config\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.378583 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-service-ca\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.379482 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/61015d8b-cce7-496b-abc6-3b8728072665-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-xgl8t\" (UID: \"61015d8b-cce7-496b-abc6-3b8728072665\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.380752 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-console-oauth-config\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.380837 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-console-serving-cert\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.404021 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq7pq\" (UniqueName: \"kubernetes.io/projected/ba9e90fe-291c-4d0e-bf67-f5d77f62613d-kube-api-access-mq7pq\") pod \"console-547798d967-44879\" (UID: \"ba9e90fe-291c-4d0e-bf67-f5d77f62613d\") " pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.476080 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/662bab54-215c-4d55-89c1-9ae40f088ae8-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pqz7x\" (UID: \"662bab54-215c-4d55-89c1-9ae40f088ae8\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.481516 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/662bab54-215c-4d55-89c1-9ae40f088ae8-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pqz7x\" (UID: \"662bab54-215c-4d55-89c1-9ae40f088ae8\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.486763 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-547798d967-44879" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.605520 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.748761 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.845478 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t"] Dec 04 10:27:37 crc kubenswrapper[4831]: W1204 10:27:37.861847 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61015d8b_cce7_496b_abc6_3b8728072665.slice/crio-55f97388cfde41bfb3a90c294cc93e5d812679399881a36ec617d3b831fa475a WatchSource:0}: Error finding container 55f97388cfde41bfb3a90c294cc93e5d812679399881a36ec617d3b831fa475a: Status 404 returned error can't find the container with id 55f97388cfde41bfb3a90c294cc93e5d812679399881a36ec617d3b831fa475a Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.937257 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-547798d967-44879"] Dec 04 10:27:37 crc kubenswrapper[4831]: W1204 10:27:37.947364 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba9e90fe_291c_4d0e_bf67_f5d77f62613d.slice/crio-b324efda454812a98487dca5bba0f22ad9a5e651da23052d58d72609aabc928c WatchSource:0}: Error finding container b324efda454812a98487dca5bba0f22ad9a5e651da23052d58d72609aabc928c: Status 404 returned error can't find the container with id b324efda454812a98487dca5bba0f22ad9a5e651da23052d58d72609aabc928c Dec 04 10:27:37 crc kubenswrapper[4831]: I1204 10:27:37.961313 4831 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x"] Dec 04 10:27:37 crc kubenswrapper[4831]: W1204 10:27:37.966575 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod662bab54_215c_4d55_89c1_9ae40f088ae8.slice/crio-33337241feddb8804693c3873e2df11aec1dd53c2d0e270d587250e86f288e82 WatchSource:0}: Error finding container 33337241feddb8804693c3873e2df11aec1dd53c2d0e270d587250e86f288e82: Status 404 returned error can't find the container with id 33337241feddb8804693c3873e2df11aec1dd53c2d0e270d587250e86f288e82 Dec 04 10:27:38 crc kubenswrapper[4831]: I1204 10:27:38.275466 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" event={"ID":"662bab54-215c-4d55-89c1-9ae40f088ae8","Type":"ContainerStarted","Data":"33337241feddb8804693c3873e2df11aec1dd53c2d0e270d587250e86f288e82"} Dec 04 10:27:38 crc kubenswrapper[4831]: I1204 10:27:38.278148 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" event={"ID":"61015d8b-cce7-496b-abc6-3b8728072665","Type":"ContainerStarted","Data":"55f97388cfde41bfb3a90c294cc93e5d812679399881a36ec617d3b831fa475a"} Dec 04 10:27:38 crc kubenswrapper[4831]: I1204 10:27:38.279809 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547798d967-44879" event={"ID":"ba9e90fe-291c-4d0e-bf67-f5d77f62613d","Type":"ContainerStarted","Data":"dffb6237208c7e8a00c093918bae8d64cc3dde2d5e9fe6aec05c84aa3bb412ec"} Dec 04 10:27:38 crc kubenswrapper[4831]: I1204 10:27:38.279848 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547798d967-44879" event={"ID":"ba9e90fe-291c-4d0e-bf67-f5d77f62613d","Type":"ContainerStarted","Data":"b324efda454812a98487dca5bba0f22ad9a5e651da23052d58d72609aabc928c"} Dec 04 10:27:38 crc kubenswrapper[4831]: I1204 10:27:38.301240 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-547798d967-44879" podStartSLOduration=1.301217111 podStartE2EDuration="1.301217111s" podCreationTimestamp="2025-12-04 10:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:27:38.298305763 +0000 UTC m=+755.247481097" watchObservedRunningTime="2025-12-04 10:27:38.301217111 +0000 UTC m=+755.250392435" Dec 04 10:27:41 crc kubenswrapper[4831]: I1204 10:27:41.319304 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv" event={"ID":"7792787f-3a02-47e6-b818-8875b8c5b1d7","Type":"ContainerStarted","Data":"015e389a26b63ae2b4b0b0abfb571ed2c1b30438289dbb4d2d3fe9fb51fdcec1"} Dec 04 10:27:41 crc kubenswrapper[4831]: I1204 10:27:41.322930 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" event={"ID":"662bab54-215c-4d55-89c1-9ae40f088ae8","Type":"ContainerStarted","Data":"10ef708cc95582755478d66c0fe10f5ad376b304bb288e85b3a42a183c3f56a8"} Dec 04 10:27:41 crc kubenswrapper[4831]: I1204 10:27:41.325250 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xxbff" event={"ID":"db769c39-2e84-4cf2-b604-73ca1c18c017","Type":"ContainerStarted","Data":"21579f8b3431991db3c36b536594712fd442faaa898ef850db921b6314af8924"} Dec 04 10:27:41 crc kubenswrapper[4831]: I1204 10:27:41.326709 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:41 crc kubenswrapper[4831]: I1204 10:27:41.328410 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" event={"ID":"61015d8b-cce7-496b-abc6-3b8728072665","Type":"ContainerStarted","Data":"1d47c5277afa5f2cbe6d6517fb3902233f950622a093a99a5689025c7655f20f"} Dec 04 
10:27:41 crc kubenswrapper[4831]: I1204 10:27:41.328756 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" Dec 04 10:27:41 crc kubenswrapper[4831]: I1204 10:27:41.344758 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqz7x" podStartSLOduration=3.1302070300000002 podStartE2EDuration="5.344732382s" podCreationTimestamp="2025-12-04 10:27:36 +0000 UTC" firstStartedPulling="2025-12-04 10:27:37.969351517 +0000 UTC m=+754.918526831" lastFinishedPulling="2025-12-04 10:27:40.183876869 +0000 UTC m=+757.133052183" observedRunningTime="2025-12-04 10:27:41.343220031 +0000 UTC m=+758.292395375" watchObservedRunningTime="2025-12-04 10:27:41.344732382 +0000 UTC m=+758.293907726" Dec 04 10:27:41 crc kubenswrapper[4831]: I1204 10:27:41.389801 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" podStartSLOduration=3.074739151 podStartE2EDuration="5.389782182s" podCreationTimestamp="2025-12-04 10:27:36 +0000 UTC" firstStartedPulling="2025-12-04 10:27:37.864435421 +0000 UTC m=+754.813610735" lastFinishedPulling="2025-12-04 10:27:40.179478452 +0000 UTC m=+757.128653766" observedRunningTime="2025-12-04 10:27:41.388554659 +0000 UTC m=+758.337729993" watchObservedRunningTime="2025-12-04 10:27:41.389782182 +0000 UTC m=+758.338957486" Dec 04 10:27:41 crc kubenswrapper[4831]: I1204 10:27:41.393069 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-xxbff" podStartSLOduration=2.3556245000000002 podStartE2EDuration="5.393057349s" podCreationTimestamp="2025-12-04 10:27:36 +0000 UTC" firstStartedPulling="2025-12-04 10:27:37.05938389 +0000 UTC m=+754.008559204" lastFinishedPulling="2025-12-04 10:27:40.096816739 +0000 UTC m=+757.045992053" observedRunningTime="2025-12-04 10:27:41.371569367 +0000 UTC 
m=+758.320744681" watchObservedRunningTime="2025-12-04 10:27:41.393057349 +0000 UTC m=+758.342232663" Dec 04 10:27:43 crc kubenswrapper[4831]: I1204 10:27:43.343212 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv" event={"ID":"7792787f-3a02-47e6-b818-8875b8c5b1d7","Type":"ContainerStarted","Data":"16e30a89d20b13808bad81fe223b427d1dc54cfa503eae86faf02bf60f55bf71"} Dec 04 10:27:43 crc kubenswrapper[4831]: I1204 10:27:43.373891 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5xmlv" podStartSLOduration=1.869683242 podStartE2EDuration="7.373865363s" podCreationTimestamp="2025-12-04 10:27:36 +0000 UTC" firstStartedPulling="2025-12-04 10:27:37.202577075 +0000 UTC m=+754.151752389" lastFinishedPulling="2025-12-04 10:27:42.706759186 +0000 UTC m=+759.655934510" observedRunningTime="2025-12-04 10:27:43.363409415 +0000 UTC m=+760.312584749" watchObservedRunningTime="2025-12-04 10:27:43.373865363 +0000 UTC m=+760.323040707" Dec 04 10:27:47 crc kubenswrapper[4831]: I1204 10:27:47.068527 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-xxbff" Dec 04 10:27:47 crc kubenswrapper[4831]: I1204 10:27:47.529320 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-547798d967-44879" Dec 04 10:27:47 crc kubenswrapper[4831]: I1204 10:27:47.529877 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-547798d967-44879" Dec 04 10:27:47 crc kubenswrapper[4831]: I1204 10:27:47.537512 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-547798d967-44879" Dec 04 10:27:48 crc kubenswrapper[4831]: I1204 10:27:48.386765 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-547798d967-44879" Dec 04 
10:27:48 crc kubenswrapper[4831]: I1204 10:27:48.459679 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-twg79"] Dec 04 10:27:57 crc kubenswrapper[4831]: I1204 10:27:57.615416 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xgl8t" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.476296 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v"] Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.477853 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.480712 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.495936 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v"] Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.510755 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.510830 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk9zq\" (UniqueName: \"kubernetes.io/projected/5359fc14-ea59-4f53-b1e4-c89f65453df0-kube-api-access-sk9zq\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.510895 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.611647 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.611766 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk9zq\" (UniqueName: \"kubernetes.io/projected/5359fc14-ea59-4f53-b1e4-c89f65453df0-kube-api-access-sk9zq\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.611844 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") " 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.612442 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.612491 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.634486 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk9zq\" (UniqueName: \"kubernetes.io/projected/5359fc14-ea59-4f53-b1e4-c89f65453df0-kube-api-access-sk9zq\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" Dec 04 10:28:12 crc kubenswrapper[4831]: I1204 10:28:12.799157 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" Dec 04 10:28:13 crc kubenswrapper[4831]: I1204 10:28:13.245562 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v"] Dec 04 10:28:13 crc kubenswrapper[4831]: I1204 10:28:13.518908 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-twg79" podUID="bc1e4c46-b909-410a-980d-a045d5b3a636" containerName="console" containerID="cri-o://7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b" gracePeriod=15 Dec 04 10:28:13 crc kubenswrapper[4831]: I1204 10:28:13.568577 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" event={"ID":"5359fc14-ea59-4f53-b1e4-c89f65453df0","Type":"ContainerStarted","Data":"d282e62614ab669f299b479082050ddffc517a4f1616b39bc5c2ccd138b695f3"} Dec 04 10:28:13 crc kubenswrapper[4831]: I1204 10:28:13.568624 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" event={"ID":"5359fc14-ea59-4f53-b1e4-c89f65453df0","Type":"ContainerStarted","Data":"e66482f270ad5f67ad7b39dc92d9ebf60ddfb171724243bc32dbc6718025d171"} Dec 04 10:28:13 crc kubenswrapper[4831]: E1204 10:28:13.584001 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1e4c46_b909_410a_980d_a045d5b3a636.slice/crio-conmon-7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:28:13 crc kubenswrapper[4831]: I1204 10:28:13.872852 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-twg79_bc1e4c46-b909-410a-980d-a045d5b3a636/console/0.log" Dec 04 10:28:13 crc kubenswrapper[4831]: I1204 10:28:13.873163 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:28:13 crc kubenswrapper[4831]: I1204 10:28:13.927510 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-oauth-serving-cert\") pod \"bc1e4c46-b909-410a-980d-a045d5b3a636\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " Dec 04 10:28:13 crc kubenswrapper[4831]: I1204 10:28:13.928295 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bc1e4c46-b909-410a-980d-a045d5b3a636" (UID: "bc1e4c46-b909-410a-980d-a045d5b3a636"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.028719 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-service-ca\") pod \"bc1e4c46-b909-410a-980d-a045d5b3a636\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.028853 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-trusted-ca-bundle\") pod \"bc1e4c46-b909-410a-980d-a045d5b3a636\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.028929 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-serving-cert\") pod \"bc1e4c46-b909-410a-980d-a045d5b3a636\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.028961 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-oauth-config\") pod \"bc1e4c46-b909-410a-980d-a045d5b3a636\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.029000 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-console-config\") pod \"bc1e4c46-b909-410a-980d-a045d5b3a636\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.029028 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-97bjx\" (UniqueName: \"kubernetes.io/projected/bc1e4c46-b909-410a-980d-a045d5b3a636-kube-api-access-97bjx\") pod \"bc1e4c46-b909-410a-980d-a045d5b3a636\" (UID: \"bc1e4c46-b909-410a-980d-a045d5b3a636\") " Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.029474 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-service-ca" (OuterVolumeSpecName: "service-ca") pod "bc1e4c46-b909-410a-980d-a045d5b3a636" (UID: "bc1e4c46-b909-410a-980d-a045d5b3a636"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.029487 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bc1e4c46-b909-410a-980d-a045d5b3a636" (UID: "bc1e4c46-b909-410a-980d-a045d5b3a636"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.029735 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-console-config" (OuterVolumeSpecName: "console-config") pod "bc1e4c46-b909-410a-980d-a045d5b3a636" (UID: "bc1e4c46-b909-410a-980d-a045d5b3a636"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.030051 4831 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.030085 4831 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.030103 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.030120 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc1e4c46-b909-410a-980d-a045d5b3a636-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.035526 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bc1e4c46-b909-410a-980d-a045d5b3a636" (UID: "bc1e4c46-b909-410a-980d-a045d5b3a636"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.035572 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1e4c46-b909-410a-980d-a045d5b3a636-kube-api-access-97bjx" (OuterVolumeSpecName: "kube-api-access-97bjx") pod "bc1e4c46-b909-410a-980d-a045d5b3a636" (UID: "bc1e4c46-b909-410a-980d-a045d5b3a636"). InnerVolumeSpecName "kube-api-access-97bjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.035880 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bc1e4c46-b909-410a-980d-a045d5b3a636" (UID: "bc1e4c46-b909-410a-980d-a045d5b3a636"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.130878 4831 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.130916 4831 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc1e4c46-b909-410a-980d-a045d5b3a636-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.130929 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97bjx\" (UniqueName: \"kubernetes.io/projected/bc1e4c46-b909-410a-980d-a045d5b3a636-kube-api-access-97bjx\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.579286 4831 generic.go:334] "Generic (PLEG): container finished" podID="5359fc14-ea59-4f53-b1e4-c89f65453df0" containerID="d282e62614ab669f299b479082050ddffc517a4f1616b39bc5c2ccd138b695f3" exitCode=0 Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.579462 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" event={"ID":"5359fc14-ea59-4f53-b1e4-c89f65453df0","Type":"ContainerDied","Data":"d282e62614ab669f299b479082050ddffc517a4f1616b39bc5c2ccd138b695f3"} Dec 04 10:28:14 crc 
kubenswrapper[4831]: I1204 10:28:14.589323 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-twg79_bc1e4c46-b909-410a-980d-a045d5b3a636/console/0.log" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.589417 4831 generic.go:334] "Generic (PLEG): container finished" podID="bc1e4c46-b909-410a-980d-a045d5b3a636" containerID="7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b" exitCode=2 Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.589470 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-twg79" event={"ID":"bc1e4c46-b909-410a-980d-a045d5b3a636","Type":"ContainerDied","Data":"7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b"} Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.589515 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-twg79" event={"ID":"bc1e4c46-b909-410a-980d-a045d5b3a636","Type":"ContainerDied","Data":"917523e78438bd882750464ab15c15a4f9116a44cf5d4b1095f0318ce2615eba"} Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.589553 4831 scope.go:117] "RemoveContainer" containerID="7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.589808 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-twg79" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.616265 4831 scope.go:117] "RemoveContainer" containerID="7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b" Dec 04 10:28:14 crc kubenswrapper[4831]: E1204 10:28:14.617186 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b\": container with ID starting with 7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b not found: ID does not exist" containerID="7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.617230 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b"} err="failed to get container status \"7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b\": rpc error: code = NotFound desc = could not find container \"7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b\": container with ID starting with 7c69ff7524e7910118f58d65c9bd57f9a010ee22ca20405cca0beb413861e01b not found: ID does not exist" Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.630347 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-twg79"] Dec 04 10:28:14 crc kubenswrapper[4831]: I1204 10:28:14.634140 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-twg79"] Dec 04 10:28:15 crc kubenswrapper[4831]: I1204 10:28:15.299096 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc1e4c46-b909-410a-980d-a045d5b3a636" path="/var/lib/kubelet/pods/bc1e4c46-b909-410a-980d-a045d5b3a636/volumes" Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.042643 4831 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-l7qqj"]
Dec 04 10:28:16 crc kubenswrapper[4831]: E1204 10:28:16.043266 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1e4c46-b909-410a-980d-a045d5b3a636" containerName="console"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.043283 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1e4c46-b909-410a-980d-a045d5b3a636" containerName="console"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.043476 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1e4c46-b909-410a-980d-a045d5b3a636" containerName="console"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.044504 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.065039 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7qqj"]
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.160378 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44f8k\" (UniqueName: \"kubernetes.io/projected/25e578f8-031d-4501-8e6f-debb13a956d0-kube-api-access-44f8k\") pod \"redhat-operators-l7qqj\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") " pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.160444 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-catalog-content\") pod \"redhat-operators-l7qqj\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") " pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.160647 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-utilities\") pod \"redhat-operators-l7qqj\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") " pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.261519 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-utilities\") pod \"redhat-operators-l7qqj\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") " pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.261604 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44f8k\" (UniqueName: \"kubernetes.io/projected/25e578f8-031d-4501-8e6f-debb13a956d0-kube-api-access-44f8k\") pod \"redhat-operators-l7qqj\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") " pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.261678 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-catalog-content\") pod \"redhat-operators-l7qqj\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") " pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.262090 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-utilities\") pod \"redhat-operators-l7qqj\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") " pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.262146 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-catalog-content\") pod \"redhat-operators-l7qqj\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") " pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.281852 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44f8k\" (UniqueName: \"kubernetes.io/projected/25e578f8-031d-4501-8e6f-debb13a956d0-kube-api-access-44f8k\") pod \"redhat-operators-l7qqj\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") " pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.425089 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.606743 4831 generic.go:334] "Generic (PLEG): container finished" podID="5359fc14-ea59-4f53-b1e4-c89f65453df0" containerID="433f8a6306172b5083733541fda5e2a2b05959f4c198d0d5ca0824038daed96d" exitCode=0
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.606780 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" event={"ID":"5359fc14-ea59-4f53-b1e4-c89f65453df0","Type":"ContainerDied","Data":"433f8a6306172b5083733541fda5e2a2b05959f4c198d0d5ca0824038daed96d"}
Dec 04 10:28:16 crc kubenswrapper[4831]: I1204 10:28:16.628967 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7qqj"]
Dec 04 10:28:16 crc kubenswrapper[4831]: W1204 10:28:16.639173 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25e578f8_031d_4501_8e6f_debb13a956d0.slice/crio-62f89da0803bb9f1c1c86bc71dd92143ced734f6b4264befe5dcbded6b142025 WatchSource:0}: Error finding container 62f89da0803bb9f1c1c86bc71dd92143ced734f6b4264befe5dcbded6b142025: Status 404 returned error can't find the container with id 62f89da0803bb9f1c1c86bc71dd92143ced734f6b4264befe5dcbded6b142025
Dec 04 10:28:17 crc kubenswrapper[4831]: I1204 10:28:17.616647 4831 generic.go:334] "Generic (PLEG): container finished" podID="25e578f8-031d-4501-8e6f-debb13a956d0" containerID="eb9ed83af7d983ff5c31fa535f2ab53b0f5cd0ca34310a50557fe73ed366148c" exitCode=0
Dec 04 10:28:17 crc kubenswrapper[4831]: I1204 10:28:17.616734 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7qqj" event={"ID":"25e578f8-031d-4501-8e6f-debb13a956d0","Type":"ContainerDied","Data":"eb9ed83af7d983ff5c31fa535f2ab53b0f5cd0ca34310a50557fe73ed366148c"}
Dec 04 10:28:17 crc kubenswrapper[4831]: I1204 10:28:17.617098 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7qqj" event={"ID":"25e578f8-031d-4501-8e6f-debb13a956d0","Type":"ContainerStarted","Data":"62f89da0803bb9f1c1c86bc71dd92143ced734f6b4264befe5dcbded6b142025"}
Dec 04 10:28:17 crc kubenswrapper[4831]: I1204 10:28:17.620739 4831 generic.go:334] "Generic (PLEG): container finished" podID="5359fc14-ea59-4f53-b1e4-c89f65453df0" containerID="4cda28e00be2829ebb3fbd241b32e6b3e4878d815b4aeaaf7b90cc1271c71e1a" exitCode=0
Dec 04 10:28:17 crc kubenswrapper[4831]: I1204 10:28:17.620809 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" event={"ID":"5359fc14-ea59-4f53-b1e4-c89f65453df0","Type":"ContainerDied","Data":"4cda28e00be2829ebb3fbd241b32e6b3e4878d815b4aeaaf7b90cc1271c71e1a"}
Dec 04 10:28:18 crc kubenswrapper[4831]: I1204 10:28:18.627721 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7qqj" event={"ID":"25e578f8-031d-4501-8e6f-debb13a956d0","Type":"ContainerStarted","Data":"b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8"}
Dec 04 10:28:18 crc kubenswrapper[4831]: I1204 10:28:18.869502 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v"
Dec 04 10:28:18 crc kubenswrapper[4831]: I1204 10:28:18.995770 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-util\") pod \"5359fc14-ea59-4f53-b1e4-c89f65453df0\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") "
Dec 04 10:28:18 crc kubenswrapper[4831]: I1204 10:28:18.995879 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk9zq\" (UniqueName: \"kubernetes.io/projected/5359fc14-ea59-4f53-b1e4-c89f65453df0-kube-api-access-sk9zq\") pod \"5359fc14-ea59-4f53-b1e4-c89f65453df0\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") "
Dec 04 10:28:18 crc kubenswrapper[4831]: I1204 10:28:18.995936 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-bundle\") pod \"5359fc14-ea59-4f53-b1e4-c89f65453df0\" (UID: \"5359fc14-ea59-4f53-b1e4-c89f65453df0\") "
Dec 04 10:28:18 crc kubenswrapper[4831]: I1204 10:28:18.997220 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-bundle" (OuterVolumeSpecName: "bundle") pod "5359fc14-ea59-4f53-b1e4-c89f65453df0" (UID: "5359fc14-ea59-4f53-b1e4-c89f65453df0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:28:19 crc kubenswrapper[4831]: I1204 10:28:19.003703 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5359fc14-ea59-4f53-b1e4-c89f65453df0-kube-api-access-sk9zq" (OuterVolumeSpecName: "kube-api-access-sk9zq") pod "5359fc14-ea59-4f53-b1e4-c89f65453df0" (UID: "5359fc14-ea59-4f53-b1e4-c89f65453df0"). InnerVolumeSpecName "kube-api-access-sk9zq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:28:19 crc kubenswrapper[4831]: I1204 10:28:19.098086 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk9zq\" (UniqueName: \"kubernetes.io/projected/5359fc14-ea59-4f53-b1e4-c89f65453df0-kube-api-access-sk9zq\") on node \"crc\" DevicePath \"\""
Dec 04 10:28:19 crc kubenswrapper[4831]: I1204 10:28:19.098147 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 10:28:19 crc kubenswrapper[4831]: I1204 10:28:19.592288 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-util" (OuterVolumeSpecName: "util") pod "5359fc14-ea59-4f53-b1e4-c89f65453df0" (UID: "5359fc14-ea59-4f53-b1e4-c89f65453df0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:28:19 crc kubenswrapper[4831]: I1204 10:28:19.605478 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5359fc14-ea59-4f53-b1e4-c89f65453df0-util\") on node \"crc\" DevicePath \"\""
Dec 04 10:28:19 crc kubenswrapper[4831]: I1204 10:28:19.639827 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v"
Dec 04 10:28:19 crc kubenswrapper[4831]: I1204 10:28:19.639834 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v" event={"ID":"5359fc14-ea59-4f53-b1e4-c89f65453df0","Type":"ContainerDied","Data":"e66482f270ad5f67ad7b39dc92d9ebf60ddfb171724243bc32dbc6718025d171"}
Dec 04 10:28:19 crc kubenswrapper[4831]: I1204 10:28:19.639929 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e66482f270ad5f67ad7b39dc92d9ebf60ddfb171724243bc32dbc6718025d171"
Dec 04 10:28:19 crc kubenswrapper[4831]: I1204 10:28:19.643488 4831 generic.go:334] "Generic (PLEG): container finished" podID="25e578f8-031d-4501-8e6f-debb13a956d0" containerID="b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8" exitCode=0
Dec 04 10:28:19 crc kubenswrapper[4831]: I1204 10:28:19.643615 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7qqj" event={"ID":"25e578f8-031d-4501-8e6f-debb13a956d0","Type":"ContainerDied","Data":"b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8"}
Dec 04 10:28:21 crc kubenswrapper[4831]: I1204 10:28:21.971574 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:28:21 crc kubenswrapper[4831]: I1204 10:28:21.972156 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:28:22 crc kubenswrapper[4831]: I1204 10:28:22.666277 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7qqj" event={"ID":"25e578f8-031d-4501-8e6f-debb13a956d0","Type":"ContainerStarted","Data":"afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4"}
Dec 04 10:28:22 crc kubenswrapper[4831]: I1204 10:28:22.685086 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l7qqj" podStartSLOduration=2.692749832 podStartE2EDuration="6.685063657s" podCreationTimestamp="2025-12-04 10:28:16 +0000 UTC" firstStartedPulling="2025-12-04 10:28:17.619236546 +0000 UTC m=+794.568411860" lastFinishedPulling="2025-12-04 10:28:21.611550361 +0000 UTC m=+798.560725685" observedRunningTime="2025-12-04 10:28:22.681811381 +0000 UTC m=+799.630986705" watchObservedRunningTime="2025-12-04 10:28:22.685063657 +0000 UTC m=+799.634238981"
Dec 04 10:28:26 crc kubenswrapper[4831]: I1204 10:28:26.425455 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:26 crc kubenswrapper[4831]: I1204 10:28:26.425741 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:27 crc kubenswrapper[4831]: I1204 10:28:27.474407 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l7qqj" podUID="25e578f8-031d-4501-8e6f-debb13a956d0" containerName="registry-server" probeResult="failure" output=<
Dec 04 10:28:27 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s
Dec 04 10:28:27 crc kubenswrapper[4831]: >
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.726412 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-797466985-hc4vf"]
Dec 04 10:28:28 crc kubenswrapper[4831]: E1204 10:28:28.727479 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5359fc14-ea59-4f53-b1e4-c89f65453df0" containerName="util"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.727583 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5359fc14-ea59-4f53-b1e4-c89f65453df0" containerName="util"
Dec 04 10:28:28 crc kubenswrapper[4831]: E1204 10:28:28.727684 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5359fc14-ea59-4f53-b1e4-c89f65453df0" containerName="pull"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.727776 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5359fc14-ea59-4f53-b1e4-c89f65453df0" containerName="pull"
Dec 04 10:28:28 crc kubenswrapper[4831]: E1204 10:28:28.727860 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5359fc14-ea59-4f53-b1e4-c89f65453df0" containerName="extract"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.727934 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5359fc14-ea59-4f53-b1e4-c89f65453df0" containerName="extract"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.728134 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5359fc14-ea59-4f53-b1e4-c89f65453df0" containerName="extract"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.728723 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.730826 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-58hqb"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.730905 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.730943 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.731226 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.733352 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.742939 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-797466985-hc4vf"]
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.824237 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f3f650b-08d4-44fb-80c0-eed2144aa7fd-webhook-cert\") pod \"metallb-operator-controller-manager-797466985-hc4vf\" (UID: \"2f3f650b-08d4-44fb-80c0-eed2144aa7fd\") " pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.824376 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjhdr\" (UniqueName: \"kubernetes.io/projected/2f3f650b-08d4-44fb-80c0-eed2144aa7fd-kube-api-access-bjhdr\") pod \"metallb-operator-controller-manager-797466985-hc4vf\" (UID: \"2f3f650b-08d4-44fb-80c0-eed2144aa7fd\") " pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.824500 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f3f650b-08d4-44fb-80c0-eed2144aa7fd-apiservice-cert\") pod \"metallb-operator-controller-manager-797466985-hc4vf\" (UID: \"2f3f650b-08d4-44fb-80c0-eed2144aa7fd\") " pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.925675 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f3f650b-08d4-44fb-80c0-eed2144aa7fd-webhook-cert\") pod \"metallb-operator-controller-manager-797466985-hc4vf\" (UID: \"2f3f650b-08d4-44fb-80c0-eed2144aa7fd\") " pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.925747 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjhdr\" (UniqueName: \"kubernetes.io/projected/2f3f650b-08d4-44fb-80c0-eed2144aa7fd-kube-api-access-bjhdr\") pod \"metallb-operator-controller-manager-797466985-hc4vf\" (UID: \"2f3f650b-08d4-44fb-80c0-eed2144aa7fd\") " pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.925797 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f3f650b-08d4-44fb-80c0-eed2144aa7fd-apiservice-cert\") pod \"metallb-operator-controller-manager-797466985-hc4vf\" (UID: \"2f3f650b-08d4-44fb-80c0-eed2144aa7fd\") " pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.931445 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f3f650b-08d4-44fb-80c0-eed2144aa7fd-apiservice-cert\") pod \"metallb-operator-controller-manager-797466985-hc4vf\" (UID: \"2f3f650b-08d4-44fb-80c0-eed2144aa7fd\") " pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.934361 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f3f650b-08d4-44fb-80c0-eed2144aa7fd-webhook-cert\") pod \"metallb-operator-controller-manager-797466985-hc4vf\" (UID: \"2f3f650b-08d4-44fb-80c0-eed2144aa7fd\") " pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:28 crc kubenswrapper[4831]: I1204 10:28:28.948145 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjhdr\" (UniqueName: \"kubernetes.io/projected/2f3f650b-08d4-44fb-80c0-eed2144aa7fd-kube-api-access-bjhdr\") pod \"metallb-operator-controller-manager-797466985-hc4vf\" (UID: \"2f3f650b-08d4-44fb-80c0-eed2144aa7fd\") " pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.019685 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"]
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.020694 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.022159 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.022605 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-wwkfb"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.022632 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.026628 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/513753c7-a982-42db-b4f9-a06f04c2f806-apiservice-cert\") pod \"metallb-operator-webhook-server-fb84d4944-xtmbd\" (UID: \"513753c7-a982-42db-b4f9-a06f04c2f806\") " pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.026687 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/513753c7-a982-42db-b4f9-a06f04c2f806-webhook-cert\") pod \"metallb-operator-webhook-server-fb84d4944-xtmbd\" (UID: \"513753c7-a982-42db-b4f9-a06f04c2f806\") " pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.026986 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sg7w\" (UniqueName: \"kubernetes.io/projected/513753c7-a982-42db-b4f9-a06f04c2f806-kube-api-access-9sg7w\") pod \"metallb-operator-webhook-server-fb84d4944-xtmbd\" (UID: \"513753c7-a982-42db-b4f9-a06f04c2f806\") " pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.032085 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"]
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.047255 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.128193 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sg7w\" (UniqueName: \"kubernetes.io/projected/513753c7-a982-42db-b4f9-a06f04c2f806-kube-api-access-9sg7w\") pod \"metallb-operator-webhook-server-fb84d4944-xtmbd\" (UID: \"513753c7-a982-42db-b4f9-a06f04c2f806\") " pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.128278 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/513753c7-a982-42db-b4f9-a06f04c2f806-apiservice-cert\") pod \"metallb-operator-webhook-server-fb84d4944-xtmbd\" (UID: \"513753c7-a982-42db-b4f9-a06f04c2f806\") " pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.128310 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/513753c7-a982-42db-b4f9-a06f04c2f806-webhook-cert\") pod \"metallb-operator-webhook-server-fb84d4944-xtmbd\" (UID: \"513753c7-a982-42db-b4f9-a06f04c2f806\") " pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.134141 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/513753c7-a982-42db-b4f9-a06f04c2f806-webhook-cert\") pod \"metallb-operator-webhook-server-fb84d4944-xtmbd\" (UID: \"513753c7-a982-42db-b4f9-a06f04c2f806\") " pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.135146 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/513753c7-a982-42db-b4f9-a06f04c2f806-apiservice-cert\") pod \"metallb-operator-webhook-server-fb84d4944-xtmbd\" (UID: \"513753c7-a982-42db-b4f9-a06f04c2f806\") " pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.161637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sg7w\" (UniqueName: \"kubernetes.io/projected/513753c7-a982-42db-b4f9-a06f04c2f806-kube-api-access-9sg7w\") pod \"metallb-operator-webhook-server-fb84d4944-xtmbd\" (UID: \"513753c7-a982-42db-b4f9-a06f04c2f806\") " pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.285398 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-797466985-hc4vf"]
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.335562 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.532239 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"]
Dec 04 10:28:29 crc kubenswrapper[4831]: W1204 10:28:29.535218 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod513753c7_a982_42db_b4f9_a06f04c2f806.slice/crio-b24e68f958da5b822278342087986ccf7ae3570c71ee3c3ba11aa25979503750 WatchSource:0}: Error finding container b24e68f958da5b822278342087986ccf7ae3570c71ee3c3ba11aa25979503750: Status 404 returned error can't find the container with id b24e68f958da5b822278342087986ccf7ae3570c71ee3c3ba11aa25979503750
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.708449 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf" event={"ID":"2f3f650b-08d4-44fb-80c0-eed2144aa7fd","Type":"ContainerStarted","Data":"0d21038f797d710ed280143c0902083e33429ddb12a011a6590d7d2ca076ec08"}
Dec 04 10:28:29 crc kubenswrapper[4831]: I1204 10:28:29.709168 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd" event={"ID":"513753c7-a982-42db-b4f9-a06f04c2f806","Type":"ContainerStarted","Data":"b24e68f958da5b822278342087986ccf7ae3570c71ee3c3ba11aa25979503750"}
Dec 04 10:28:36 crc kubenswrapper[4831]: I1204 10:28:36.464567 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:36 crc kubenswrapper[4831]: I1204 10:28:36.527265 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:36 crc kubenswrapper[4831]: I1204 10:28:36.700195 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7qqj"]
Dec 04 10:28:36 crc kubenswrapper[4831]: I1204 10:28:36.754915 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd" event={"ID":"513753c7-a982-42db-b4f9-a06f04c2f806","Type":"ContainerStarted","Data":"6d9e66f3cd10826012eba9fcd2130b9b607224485b495b707ef38947b154967a"}
Dec 04 10:28:36 crc kubenswrapper[4831]: I1204 10:28:36.755023 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd"
Dec 04 10:28:36 crc kubenswrapper[4831]: I1204 10:28:36.758831 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf" event={"ID":"2f3f650b-08d4-44fb-80c0-eed2144aa7fd","Type":"ContainerStarted","Data":"34e1efe2777938e46ddeb01cd36d6dceb5f68797f1bd153b70d78f19338f4056"}
Dec 04 10:28:36 crc kubenswrapper[4831]: I1204 10:28:36.759104 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf"
Dec 04 10:28:36 crc kubenswrapper[4831]: I1204 10:28:36.797289 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd" podStartSLOduration=1.154321713 podStartE2EDuration="7.797259379s" podCreationTimestamp="2025-12-04 10:28:29 +0000 UTC" firstStartedPulling="2025-12-04 10:28:29.539608342 +0000 UTC m=+806.488783656" lastFinishedPulling="2025-12-04 10:28:36.182545998 +0000 UTC m=+813.131721322" observedRunningTime="2025-12-04 10:28:36.792332978 +0000 UTC m=+813.741508342" watchObservedRunningTime="2025-12-04 10:28:36.797259379 +0000 UTC m=+813.746434743"
Dec 04 10:28:36 crc kubenswrapper[4831]: I1204 10:28:36.850580 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf" podStartSLOduration=2.038737572 podStartE2EDuration="8.850549419s" podCreationTimestamp="2025-12-04 10:28:28 +0000 UTC" firstStartedPulling="2025-12-04 10:28:29.298134747 +0000 UTC m=+806.247310061" lastFinishedPulling="2025-12-04 10:28:36.109946594 +0000 UTC m=+813.059121908" observedRunningTime="2025-12-04 10:28:36.84421316 +0000 UTC m=+813.793388484" watchObservedRunningTime="2025-12-04 10:28:36.850549419 +0000 UTC m=+813.799724763"
Dec 04 10:28:37 crc kubenswrapper[4831]: I1204 10:28:37.767090 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l7qqj" podUID="25e578f8-031d-4501-8e6f-debb13a956d0" containerName="registry-server" containerID="cri-o://afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4" gracePeriod=2
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.112944 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.254113 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-catalog-content\") pod \"25e578f8-031d-4501-8e6f-debb13a956d0\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") "
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.254213 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-utilities\") pod \"25e578f8-031d-4501-8e6f-debb13a956d0\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") "
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.254311 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44f8k\" (UniqueName: \"kubernetes.io/projected/25e578f8-031d-4501-8e6f-debb13a956d0-kube-api-access-44f8k\") pod \"25e578f8-031d-4501-8e6f-debb13a956d0\" (UID: \"25e578f8-031d-4501-8e6f-debb13a956d0\") "
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.255923 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-utilities" (OuterVolumeSpecName: "utilities") pod "25e578f8-031d-4501-8e6f-debb13a956d0" (UID: "25e578f8-031d-4501-8e6f-debb13a956d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.259585 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e578f8-031d-4501-8e6f-debb13a956d0-kube-api-access-44f8k" (OuterVolumeSpecName: "kube-api-access-44f8k") pod "25e578f8-031d-4501-8e6f-debb13a956d0" (UID: "25e578f8-031d-4501-8e6f-debb13a956d0"). InnerVolumeSpecName "kube-api-access-44f8k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.362813 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.362843 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44f8k\" (UniqueName: \"kubernetes.io/projected/25e578f8-031d-4501-8e6f-debb13a956d0-kube-api-access-44f8k\") on node \"crc\" DevicePath \"\""
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.373064 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25e578f8-031d-4501-8e6f-debb13a956d0" (UID: "25e578f8-031d-4501-8e6f-debb13a956d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.464004 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e578f8-031d-4501-8e6f-debb13a956d0-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.789833 4831 generic.go:334] "Generic (PLEG): container finished" podID="25e578f8-031d-4501-8e6f-debb13a956d0" containerID="afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4" exitCode=0
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.789895 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7qqj" event={"ID":"25e578f8-031d-4501-8e6f-debb13a956d0","Type":"ContainerDied","Data":"afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4"}
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.789966 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7qqj" event={"ID":"25e578f8-031d-4501-8e6f-debb13a956d0","Type":"ContainerDied","Data":"62f89da0803bb9f1c1c86bc71dd92143ced734f6b4264befe5dcbded6b142025"}
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.789963 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7qqj"
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.789990 4831 scope.go:117] "RemoveContainer" containerID="afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4"
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.815532 4831 scope.go:117] "RemoveContainer" containerID="b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8"
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.826446 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7qqj"]
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.832077 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l7qqj"]
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.849117 4831 scope.go:117] "RemoveContainer" containerID="eb9ed83af7d983ff5c31fa535f2ab53b0f5cd0ca34310a50557fe73ed366148c"
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.867912 4831 scope.go:117] "RemoveContainer" containerID="afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4"
Dec 04 10:28:38 crc kubenswrapper[4831]: E1204 10:28:38.868358 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4\": container with ID starting with afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4 not found: ID does not exist" containerID="afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4"
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.868411 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4"} err="failed to get container status \"afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4\": rpc error: code = NotFound desc = could not find container \"afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4\": container with ID starting with afea79b792dd20c86a1598e896c1f01b9c5ecc6ee33b67303f521071bde6b3b4 not found: ID does not exist"
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.868460 4831 scope.go:117] "RemoveContainer" containerID="b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8"
Dec 04 10:28:38 crc kubenswrapper[4831]: E1204 10:28:38.868811 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8\": container with ID starting with b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8 not found: ID does not exist" containerID="b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8"
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.868852 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8"} err="failed to get container status \"b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8\": rpc error: code = NotFound desc = could not find container \"b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8\": container with ID starting with b828a684a8223919df67e8f34c3140530417e81acc499998e28ce609830af4b8 not found: ID does not exist"
Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.868879 4831 scope.go:117] "RemoveContainer" containerID="eb9ed83af7d983ff5c31fa535f2ab53b0f5cd0ca34310a50557fe73ed366148c"
Dec 04 10:28:38 crc kubenswrapper[4831]: E1204 10:28:38.869270 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb9ed83af7d983ff5c31fa535f2ab53b0f5cd0ca34310a50557fe73ed366148c\": container with ID starting with eb9ed83af7d983ff5c31fa535f2ab53b0f5cd0ca34310a50557fe73ed366148c not found: ID does not exist"
containerID="eb9ed83af7d983ff5c31fa535f2ab53b0f5cd0ca34310a50557fe73ed366148c" Dec 04 10:28:38 crc kubenswrapper[4831]: I1204 10:28:38.869311 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb9ed83af7d983ff5c31fa535f2ab53b0f5cd0ca34310a50557fe73ed366148c"} err="failed to get container status \"eb9ed83af7d983ff5c31fa535f2ab53b0f5cd0ca34310a50557fe73ed366148c\": rpc error: code = NotFound desc = could not find container \"eb9ed83af7d983ff5c31fa535f2ab53b0f5cd0ca34310a50557fe73ed366148c\": container with ID starting with eb9ed83af7d983ff5c31fa535f2ab53b0f5cd0ca34310a50557fe73ed366148c not found: ID does not exist" Dec 04 10:28:39 crc kubenswrapper[4831]: I1204 10:28:39.285145 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e578f8-031d-4501-8e6f-debb13a956d0" path="/var/lib/kubelet/pods/25e578f8-031d-4501-8e6f-debb13a956d0/volumes" Dec 04 10:28:49 crc kubenswrapper[4831]: I1204 10:28:49.340100 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-fb84d4944-xtmbd" Dec 04 10:28:51 crc kubenswrapper[4831]: I1204 10:28:51.972164 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:28:51 crc kubenswrapper[4831]: I1204 10:28:51.972278 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.052564 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/metallb-operator-controller-manager-797466985-hc4vf" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.837328 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678"] Dec 04 10:29:09 crc kubenswrapper[4831]: E1204 10:29:09.837651 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e578f8-031d-4501-8e6f-debb13a956d0" containerName="extract-content" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.837703 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e578f8-031d-4501-8e6f-debb13a956d0" containerName="extract-content" Dec 04 10:29:09 crc kubenswrapper[4831]: E1204 10:29:09.837724 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e578f8-031d-4501-8e6f-debb13a956d0" containerName="extract-utilities" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.837736 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e578f8-031d-4501-8e6f-debb13a956d0" containerName="extract-utilities" Dec 04 10:29:09 crc kubenswrapper[4831]: E1204 10:29:09.837767 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e578f8-031d-4501-8e6f-debb13a956d0" containerName="registry-server" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.837778 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e578f8-031d-4501-8e6f-debb13a956d0" containerName="registry-server" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.837978 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e578f8-031d-4501-8e6f-debb13a956d0" containerName="registry-server" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.838580 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.842288 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.843921 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gngrf" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.856305 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-4d7f7"] Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.859147 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.862923 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.862923 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.870158 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678"] Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.913946 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwwlk\" (UniqueName: \"kubernetes.io/projected/f49f5bce-b752-4729-aa67-28847f9f04b1-kube-api-access-nwwlk\") pod \"frr-k8s-webhook-server-7fcb986d4-fw678\" (UID: \"f49f5bce-b752-4729-aa67-28847f9f04b1\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.914092 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f49f5bce-b752-4729-aa67-28847f9f04b1-cert\") pod 
\"frr-k8s-webhook-server-7fcb986d4-fw678\" (UID: \"f49f5bce-b752-4729-aa67-28847f9f04b1\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.931982 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-rgd8b"] Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.933158 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rgd8b" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.934885 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-x7kmp" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.935145 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.935534 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.938801 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-m2fz5"] Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.939952 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.941152 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.948544 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 04 10:29:09 crc kubenswrapper[4831]: I1204 10:29:09.963053 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-m2fz5"] Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015243 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc7c8ea8-2f48-4b72-8209-3763c0fe74e4-cert\") pod \"controller-f8648f98b-m2fz5\" (UID: \"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4\") " pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015310 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh52x\" (UniqueName: \"kubernetes.io/projected/c082e7e3-e171-4637-84a9-84f8aa17b51e-kube-api-access-bh52x\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015410 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-reloader\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015469 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-metrics\") 
pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015485 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-frr-conf\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015502 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c082e7e3-e171-4637-84a9-84f8aa17b51e-metrics-certs\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015533 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-metallb-excludel2\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015553 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkqhw\" (UniqueName: \"kubernetes.io/projected/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-kube-api-access-jkqhw\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015577 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2422n\" (UniqueName: \"kubernetes.io/projected/fc7c8ea8-2f48-4b72-8209-3763c0fe74e4-kube-api-access-2422n\") pod \"controller-f8648f98b-m2fz5\" (UID: 
\"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4\") " pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015594 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-memberlist\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015633 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-frr-sockets\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015791 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f49f5bce-b752-4729-aa67-28847f9f04b1-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-fw678\" (UID: \"f49f5bce-b752-4729-aa67-28847f9f04b1\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015844 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-metrics-certs\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015889 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc7c8ea8-2f48-4b72-8209-3763c0fe74e4-metrics-certs\") pod \"controller-f8648f98b-m2fz5\" (UID: \"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4\") " 
pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.015956 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c082e7e3-e171-4637-84a9-84f8aa17b51e-frr-startup\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.016020 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwwlk\" (UniqueName: \"kubernetes.io/projected/f49f5bce-b752-4729-aa67-28847f9f04b1-kube-api-access-nwwlk\") pod \"frr-k8s-webhook-server-7fcb986d4-fw678\" (UID: \"f49f5bce-b752-4729-aa67-28847f9f04b1\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.022437 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f49f5bce-b752-4729-aa67-28847f9f04b1-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-fw678\" (UID: \"f49f5bce-b752-4729-aa67-28847f9f04b1\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.033312 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwwlk\" (UniqueName: \"kubernetes.io/projected/f49f5bce-b752-4729-aa67-28847f9f04b1-kube-api-access-nwwlk\") pod \"frr-k8s-webhook-server-7fcb986d4-fw678\" (UID: \"f49f5bce-b752-4729-aa67-28847f9f04b1\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117547 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc7c8ea8-2f48-4b72-8209-3763c0fe74e4-cert\") pod \"controller-f8648f98b-m2fz5\" (UID: \"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4\") " 
pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117597 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-reloader\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117613 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh52x\" (UniqueName: \"kubernetes.io/projected/c082e7e3-e171-4637-84a9-84f8aa17b51e-kube-api-access-bh52x\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117639 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-metrics\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117656 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-frr-conf\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117685 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c082e7e3-e171-4637-84a9-84f8aa17b51e-metrics-certs\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117702 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-metallb-excludel2\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117720 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkqhw\" (UniqueName: \"kubernetes.io/projected/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-kube-api-access-jkqhw\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117739 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2422n\" (UniqueName: \"kubernetes.io/projected/fc7c8ea8-2f48-4b72-8209-3763c0fe74e4-kube-api-access-2422n\") pod \"controller-f8648f98b-m2fz5\" (UID: \"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4\") " pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117753 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-memberlist\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117769 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-frr-sockets\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117794 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-metrics-certs\") pod \"speaker-rgd8b\" (UID: 
\"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117811 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc7c8ea8-2f48-4b72-8209-3763c0fe74e4-metrics-certs\") pod \"controller-f8648f98b-m2fz5\" (UID: \"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4\") " pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.117844 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c082e7e3-e171-4637-84a9-84f8aa17b51e-frr-startup\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.118161 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-frr-conf\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.118370 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-reloader\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.118561 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-frr-sockets\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: E1204 10:29:10.118641 4831 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: 
secret "metallb-memberlist" not found Dec 04 10:29:10 crc kubenswrapper[4831]: E1204 10:29:10.118728 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-memberlist podName:43cdb7df-ed6c-4f7b-b460-467a22bfe06c nodeName:}" failed. No retries permitted until 2025-12-04 10:29:10.618709466 +0000 UTC m=+847.567884780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-memberlist") pod "speaker-rgd8b" (UID: "43cdb7df-ed6c-4f7b-b460-467a22bfe06c") : secret "metallb-memberlist" not found Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.118750 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c082e7e3-e171-4637-84a9-84f8aa17b51e-metrics\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.118765 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c082e7e3-e171-4637-84a9-84f8aa17b51e-frr-startup\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.119131 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-metallb-excludel2\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.119756 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.120850 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc7c8ea8-2f48-4b72-8209-3763c0fe74e4-metrics-certs\") pod \"controller-f8648f98b-m2fz5\" (UID: \"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4\") " pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.121161 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c082e7e3-e171-4637-84a9-84f8aa17b51e-metrics-certs\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.121728 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-metrics-certs\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.137687 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc7c8ea8-2f48-4b72-8209-3763c0fe74e4-cert\") pod \"controller-f8648f98b-m2fz5\" (UID: \"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4\") " pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.141706 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkqhw\" (UniqueName: \"kubernetes.io/projected/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-kube-api-access-jkqhw\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.141869 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2422n\" (UniqueName: \"kubernetes.io/projected/fc7c8ea8-2f48-4b72-8209-3763c0fe74e4-kube-api-access-2422n\") pod \"controller-f8648f98b-m2fz5\" (UID: 
\"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4\") " pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.144637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh52x\" (UniqueName: \"kubernetes.io/projected/c082e7e3-e171-4637-84a9-84f8aa17b51e-kube-api-access-bh52x\") pod \"frr-k8s-4d7f7\" (UID: \"c082e7e3-e171-4637-84a9-84f8aa17b51e\") " pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.159531 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.178717 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.258684 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.597613 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678"] Dec 04 10:29:10 crc kubenswrapper[4831]: W1204 10:29:10.604303 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf49f5bce_b752_4729_aa67_28847f9f04b1.slice/crio-5e0770eb4bc239459165f5dcbf689d2f6487c896082fcc4db5a2a7dcf053d89c WatchSource:0}: Error finding container 5e0770eb4bc239459165f5dcbf689d2f6487c896082fcc4db5a2a7dcf053d89c: Status 404 returned error can't find the container with id 5e0770eb4bc239459165f5dcbf689d2f6487c896082fcc4db5a2a7dcf053d89c Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.623017 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-memberlist\") pod 
\"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:10 crc kubenswrapper[4831]: E1204 10:29:10.623199 4831 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 10:29:10 crc kubenswrapper[4831]: E1204 10:29:10.623280 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-memberlist podName:43cdb7df-ed6c-4f7b-b460-467a22bfe06c nodeName:}" failed. No retries permitted until 2025-12-04 10:29:11.623260607 +0000 UTC m=+848.572435921 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-memberlist") pod "speaker-rgd8b" (UID: "43cdb7df-ed6c-4f7b-b460-467a22bfe06c") : secret "metallb-memberlist" not found Dec 04 10:29:10 crc kubenswrapper[4831]: I1204 10:29:10.736457 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-m2fz5"] Dec 04 10:29:10 crc kubenswrapper[4831]: W1204 10:29:10.742220 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc7c8ea8_2f48_4b72_8209_3763c0fe74e4.slice/crio-ccb094b92c24cc9e8975ca361cc107c57c99ae1f791733e7e60b3739406599b5 WatchSource:0}: Error finding container ccb094b92c24cc9e8975ca361cc107c57c99ae1f791733e7e60b3739406599b5: Status 404 returned error can't find the container with id ccb094b92c24cc9e8975ca361cc107c57c99ae1f791733e7e60b3739406599b5 Dec 04 10:29:11 crc kubenswrapper[4831]: I1204 10:29:11.012004 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4d7f7" event={"ID":"c082e7e3-e171-4637-84a9-84f8aa17b51e","Type":"ContainerStarted","Data":"ac80185826b730311e5c32055255084c0862c7f796e9139b9438b802a3404031"} Dec 04 10:29:11 crc kubenswrapper[4831]: I1204 10:29:11.013248 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" event={"ID":"f49f5bce-b752-4729-aa67-28847f9f04b1","Type":"ContainerStarted","Data":"5e0770eb4bc239459165f5dcbf689d2f6487c896082fcc4db5a2a7dcf053d89c"} Dec 04 10:29:11 crc kubenswrapper[4831]: I1204 10:29:11.015084 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-m2fz5" event={"ID":"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4","Type":"ContainerStarted","Data":"0fb39864f67b0f69b04aafbd2c6d17656d3d307096012893cbcf4fba36831a20"} Dec 04 10:29:11 crc kubenswrapper[4831]: I1204 10:29:11.015104 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-m2fz5" event={"ID":"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4","Type":"ContainerStarted","Data":"ccb094b92c24cc9e8975ca361cc107c57c99ae1f791733e7e60b3739406599b5"} Dec 04 10:29:11 crc kubenswrapper[4831]: I1204 10:29:11.635620 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-memberlist\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:11 crc kubenswrapper[4831]: I1204 10:29:11.643095 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43cdb7df-ed6c-4f7b-b460-467a22bfe06c-memberlist\") pod \"speaker-rgd8b\" (UID: \"43cdb7df-ed6c-4f7b-b460-467a22bfe06c\") " pod="metallb-system/speaker-rgd8b" Dec 04 10:29:11 crc kubenswrapper[4831]: I1204 10:29:11.748715 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-rgd8b" Dec 04 10:29:12 crc kubenswrapper[4831]: I1204 10:29:12.030649 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rgd8b" event={"ID":"43cdb7df-ed6c-4f7b-b460-467a22bfe06c","Type":"ContainerStarted","Data":"d2a23639dd50fbabbbfb7b0e0684e182a33011295095c834e70dfd3d2e79ffb5"} Dec 04 10:29:12 crc kubenswrapper[4831]: I1204 10:29:12.036710 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-m2fz5" event={"ID":"fc7c8ea8-2f48-4b72-8209-3763c0fe74e4","Type":"ContainerStarted","Data":"c3f79df57d18ebc6ad4a661274ae093d7463a7ed9783bbc9d6a337a0615fb3b4"} Dec 04 10:29:12 crc kubenswrapper[4831]: I1204 10:29:12.036781 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:12 crc kubenswrapper[4831]: I1204 10:29:12.052216 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-m2fz5" podStartSLOduration=3.052197628 podStartE2EDuration="3.052197628s" podCreationTimestamp="2025-12-04 10:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:29:12.05001917 +0000 UTC m=+848.999194484" watchObservedRunningTime="2025-12-04 10:29:12.052197628 +0000 UTC m=+849.001372942" Dec 04 10:29:13 crc kubenswrapper[4831]: I1204 10:29:13.048706 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rgd8b" event={"ID":"43cdb7df-ed6c-4f7b-b460-467a22bfe06c","Type":"ContainerStarted","Data":"e3b382bfa891b65ca0ff26335431b7ccba2f6c12dc4a23d7f9d084f3427cb089"} Dec 04 10:29:13 crc kubenswrapper[4831]: I1204 10:29:13.049016 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rgd8b" 
event={"ID":"43cdb7df-ed6c-4f7b-b460-467a22bfe06c","Type":"ContainerStarted","Data":"d0e481f2ea75d86aa537ff9bf150f627a5a01f1f83348428f69b21be2d4eb099"} Dec 04 10:29:13 crc kubenswrapper[4831]: I1204 10:29:13.049034 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-rgd8b" Dec 04 10:29:13 crc kubenswrapper[4831]: I1204 10:29:13.304427 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-rgd8b" podStartSLOduration=4.304408201 podStartE2EDuration="4.304408201s" podCreationTimestamp="2025-12-04 10:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:29:13.06840272 +0000 UTC m=+850.017578044" watchObservedRunningTime="2025-12-04 10:29:13.304408201 +0000 UTC m=+850.253583515" Dec 04 10:29:18 crc kubenswrapper[4831]: I1204 10:29:18.091639 4831 generic.go:334] "Generic (PLEG): container finished" podID="c082e7e3-e171-4637-84a9-84f8aa17b51e" containerID="63cf7b437efb8a138e4fa93f98a08068f0fd473a47cbcefb436d1db8e9eba646" exitCode=0 Dec 04 10:29:18 crc kubenswrapper[4831]: I1204 10:29:18.091819 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4d7f7" event={"ID":"c082e7e3-e171-4637-84a9-84f8aa17b51e","Type":"ContainerDied","Data":"63cf7b437efb8a138e4fa93f98a08068f0fd473a47cbcefb436d1db8e9eba646"} Dec 04 10:29:18 crc kubenswrapper[4831]: I1204 10:29:18.095594 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" event={"ID":"f49f5bce-b752-4729-aa67-28847f9f04b1","Type":"ContainerStarted","Data":"02c5b52f1749d7778856310ae6e0cc290825faa7fe67565a665343569f5e1d2f"} Dec 04 10:29:18 crc kubenswrapper[4831]: I1204 10:29:18.095874 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" Dec 04 10:29:19 crc kubenswrapper[4831]: 
I1204 10:29:19.108392 4831 generic.go:334] "Generic (PLEG): container finished" podID="c082e7e3-e171-4637-84a9-84f8aa17b51e" containerID="884a2bebcdee27c100d6548547293e718b92b04ecefca40ecfad413bfde4f280" exitCode=0 Dec 04 10:29:19 crc kubenswrapper[4831]: I1204 10:29:19.109865 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4d7f7" event={"ID":"c082e7e3-e171-4637-84a9-84f8aa17b51e","Type":"ContainerDied","Data":"884a2bebcdee27c100d6548547293e718b92b04ecefca40ecfad413bfde4f280"} Dec 04 10:29:19 crc kubenswrapper[4831]: I1204 10:29:19.161370 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" podStartSLOduration=3.225005963 podStartE2EDuration="10.161342775s" podCreationTimestamp="2025-12-04 10:29:09 +0000 UTC" firstStartedPulling="2025-12-04 10:29:10.606104689 +0000 UTC m=+847.555280003" lastFinishedPulling="2025-12-04 10:29:17.542441491 +0000 UTC m=+854.491616815" observedRunningTime="2025-12-04 10:29:18.15815309 +0000 UTC m=+855.107328484" watchObservedRunningTime="2025-12-04 10:29:19.161342775 +0000 UTC m=+856.110518129" Dec 04 10:29:20 crc kubenswrapper[4831]: I1204 10:29:20.117577 4831 generic.go:334] "Generic (PLEG): container finished" podID="c082e7e3-e171-4637-84a9-84f8aa17b51e" containerID="69d524943e007e7f4478f927ae26ae3f5b034fe21a78ba1aa97c6d4290547bb9" exitCode=0 Dec 04 10:29:20 crc kubenswrapper[4831]: I1204 10:29:20.117636 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4d7f7" event={"ID":"c082e7e3-e171-4637-84a9-84f8aa17b51e","Type":"ContainerDied","Data":"69d524943e007e7f4478f927ae26ae3f5b034fe21a78ba1aa97c6d4290547bb9"} Dec 04 10:29:20 crc kubenswrapper[4831]: I1204 10:29:20.262229 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-m2fz5" Dec 04 10:29:21 crc kubenswrapper[4831]: I1204 10:29:21.125331 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/frr-k8s-4d7f7" event={"ID":"c082e7e3-e171-4637-84a9-84f8aa17b51e","Type":"ContainerStarted","Data":"7b0fad2c47e2ee6ee908aa0604675adfe6458900602b62a12953c76b72248800"} Dec 04 10:29:21 crc kubenswrapper[4831]: I1204 10:29:21.125572 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4d7f7" event={"ID":"c082e7e3-e171-4637-84a9-84f8aa17b51e","Type":"ContainerStarted","Data":"be5bdfbcf170c7293f3025dd0aa6d2b928d943d56a1007de98bb0de487410e1c"} Dec 04 10:29:21 crc kubenswrapper[4831]: I1204 10:29:21.125582 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4d7f7" event={"ID":"c082e7e3-e171-4637-84a9-84f8aa17b51e","Type":"ContainerStarted","Data":"c7d04f35e9fcf74d0a7fedc981a33dfcec498bbb8d73562cfa2a07f05fe09607"} Dec 04 10:29:21 crc kubenswrapper[4831]: I1204 10:29:21.125591 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4d7f7" event={"ID":"c082e7e3-e171-4637-84a9-84f8aa17b51e","Type":"ContainerStarted","Data":"7895516c41c9b0071819b9c00af403696b9659fa00644c90593075a27c93f132"} Dec 04 10:29:21 crc kubenswrapper[4831]: I1204 10:29:21.971756 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:29:21 crc kubenswrapper[4831]: I1204 10:29:21.971873 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:29:21 crc kubenswrapper[4831]: I1204 10:29:21.971972 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:29:21 crc kubenswrapper[4831]: I1204 10:29:21.973109 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3282642ca73576095ec0c4e4a39a50ef22334c6e41917a860b9796381706e28c"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:29:21 crc kubenswrapper[4831]: I1204 10:29:21.973246 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://3282642ca73576095ec0c4e4a39a50ef22334c6e41917a860b9796381706e28c" gracePeriod=600 Dec 04 10:29:22 crc kubenswrapper[4831]: I1204 10:29:22.151237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4d7f7" event={"ID":"c082e7e3-e171-4637-84a9-84f8aa17b51e","Type":"ContainerStarted","Data":"65b09d5c85abb636fc161fc1819f06036791c47bcf579e8155fc966797a8d219"} Dec 04 10:29:22 crc kubenswrapper[4831]: I1204 10:29:22.151278 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4d7f7" event={"ID":"c082e7e3-e171-4637-84a9-84f8aa17b51e","Type":"ContainerStarted","Data":"8744e0da3509c8a6b3c2084c65c5794a57bb13137fc94f7454d3e0d5846f51d9"} Dec 04 10:29:22 crc kubenswrapper[4831]: I1204 10:29:22.151451 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:22 crc kubenswrapper[4831]: I1204 10:29:22.155407 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="3282642ca73576095ec0c4e4a39a50ef22334c6e41917a860b9796381706e28c" exitCode=0 Dec 04 10:29:22 crc kubenswrapper[4831]: I1204 
10:29:22.155441 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"3282642ca73576095ec0c4e4a39a50ef22334c6e41917a860b9796381706e28c"} Dec 04 10:29:22 crc kubenswrapper[4831]: I1204 10:29:22.155464 4831 scope.go:117] "RemoveContainer" containerID="5f944904d013e4dfa65943141ad3d8f85fcad518eb19d11e801287afe4feca7d" Dec 04 10:29:22 crc kubenswrapper[4831]: I1204 10:29:22.183743 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-4d7f7" podStartSLOduration=5.965294545 podStartE2EDuration="13.183728439s" podCreationTimestamp="2025-12-04 10:29:09 +0000 UTC" firstStartedPulling="2025-12-04 10:29:10.321899721 +0000 UTC m=+847.271075035" lastFinishedPulling="2025-12-04 10:29:17.540333615 +0000 UTC m=+854.489508929" observedRunningTime="2025-12-04 10:29:22.182719852 +0000 UTC m=+859.131895176" watchObservedRunningTime="2025-12-04 10:29:22.183728439 +0000 UTC m=+859.132903753" Dec 04 10:29:24 crc kubenswrapper[4831]: I1204 10:29:23.171307 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"e3d48d2b893c2eb24b90277e2cc2d8a14727460b3bf0732b9f1999efdd5e7c27"} Dec 04 10:29:25 crc kubenswrapper[4831]: I1204 10:29:25.179299 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:25 crc kubenswrapper[4831]: I1204 10:29:25.257656 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:30 crc kubenswrapper[4831]: I1204 10:29:30.165410 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-fw678" Dec 04 10:29:30 crc 
kubenswrapper[4831]: I1204 10:29:30.181605 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-4d7f7" Dec 04 10:29:31 crc kubenswrapper[4831]: I1204 10:29:31.754779 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-rgd8b" Dec 04 10:29:34 crc kubenswrapper[4831]: I1204 10:29:34.877121 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-gz9v7"] Dec 04 10:29:34 crc kubenswrapper[4831]: I1204 10:29:34.878601 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gz9v7" Dec 04 10:29:34 crc kubenswrapper[4831]: I1204 10:29:34.883863 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 04 10:29:34 crc kubenswrapper[4831]: I1204 10:29:34.890860 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 04 10:29:34 crc kubenswrapper[4831]: I1204 10:29:34.892576 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-zjwdc" Dec 04 10:29:34 crc kubenswrapper[4831]: I1204 10:29:34.906122 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gz9v7"] Dec 04 10:29:35 crc kubenswrapper[4831]: I1204 10:29:35.022099 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h22xb\" (UniqueName: \"kubernetes.io/projected/1611e770-dd83-4a9d-a496-1c36c7246ef0-kube-api-access-h22xb\") pod \"openstack-operator-index-gz9v7\" (UID: \"1611e770-dd83-4a9d-a496-1c36c7246ef0\") " pod="openstack-operators/openstack-operator-index-gz9v7" Dec 04 10:29:35 crc kubenswrapper[4831]: I1204 10:29:35.123423 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h22xb\" (UniqueName: \"kubernetes.io/projected/1611e770-dd83-4a9d-a496-1c36c7246ef0-kube-api-access-h22xb\") pod \"openstack-operator-index-gz9v7\" (UID: \"1611e770-dd83-4a9d-a496-1c36c7246ef0\") " pod="openstack-operators/openstack-operator-index-gz9v7" Dec 04 10:29:35 crc kubenswrapper[4831]: I1204 10:29:35.143788 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h22xb\" (UniqueName: \"kubernetes.io/projected/1611e770-dd83-4a9d-a496-1c36c7246ef0-kube-api-access-h22xb\") pod \"openstack-operator-index-gz9v7\" (UID: \"1611e770-dd83-4a9d-a496-1c36c7246ef0\") " pod="openstack-operators/openstack-operator-index-gz9v7" Dec 04 10:29:35 crc kubenswrapper[4831]: I1204 10:29:35.238612 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gz9v7" Dec 04 10:29:35 crc kubenswrapper[4831]: I1204 10:29:35.471225 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gz9v7"] Dec 04 10:29:36 crc kubenswrapper[4831]: I1204 10:29:36.317397 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gz9v7" event={"ID":"1611e770-dd83-4a9d-a496-1c36c7246ef0","Type":"ContainerStarted","Data":"71f3b00d152adf3b907c5cf7648b78dd399763a8810742b683a40d89daf47bfa"} Dec 04 10:29:39 crc kubenswrapper[4831]: I1204 10:29:39.336226 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gz9v7" event={"ID":"1611e770-dd83-4a9d-a496-1c36c7246ef0","Type":"ContainerStarted","Data":"06bc47b0656a60ddeacefb6e8e84844c5beff829596bfcb4e9bcf2ea023127e1"} Dec 04 10:29:39 crc kubenswrapper[4831]: I1204 10:29:39.354488 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-gz9v7" podStartSLOduration=2.405011521 podStartE2EDuration="5.354467749s" 
podCreationTimestamp="2025-12-04 10:29:34 +0000 UTC" firstStartedPulling="2025-12-04 10:29:35.481387052 +0000 UTC m=+872.430562376" lastFinishedPulling="2025-12-04 10:29:38.43084329 +0000 UTC m=+875.380018604" observedRunningTime="2025-12-04 10:29:39.351564971 +0000 UTC m=+876.300740325" watchObservedRunningTime="2025-12-04 10:29:39.354467749 +0000 UTC m=+876.303643083" Dec 04 10:29:45 crc kubenswrapper[4831]: I1204 10:29:45.239148 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-gz9v7" Dec 04 10:29:45 crc kubenswrapper[4831]: I1204 10:29:45.241340 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-gz9v7" Dec 04 10:29:45 crc kubenswrapper[4831]: I1204 10:29:45.300835 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-gz9v7" Dec 04 10:29:45 crc kubenswrapper[4831]: I1204 10:29:45.421897 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-gz9v7" Dec 04 10:29:51 crc kubenswrapper[4831]: I1204 10:29:51.964142 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q"] Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:51.967337 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:51.973922 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-65pwn" Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:51.984342 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q"] Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:52.071145 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kjxd\" (UniqueName: \"kubernetes.io/projected/b1dcbe95-c901-4e40-a828-144992b53376-kube-api-access-8kjxd\") pod \"2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") " pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:52.071227 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-util\") pod \"2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") " pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:52.071294 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-bundle\") pod \"2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") " pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 
10:29:52.172503 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kjxd\" (UniqueName: \"kubernetes.io/projected/b1dcbe95-c901-4e40-a828-144992b53376-kube-api-access-8kjxd\") pod \"2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") " pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:52.172592 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-util\") pod \"2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") " pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:52.172639 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-bundle\") pod \"2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") " pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:52.173106 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-bundle\") pod \"2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") " pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:52.173245 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-util\") pod \"2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") " pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:52.193427 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kjxd\" (UniqueName: \"kubernetes.io/projected/b1dcbe95-c901-4e40-a828-144992b53376-kube-api-access-8kjxd\") pod \"2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") " pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:52.298913 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:52 crc kubenswrapper[4831]: W1204 10:29:52.546756 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1dcbe95_c901_4e40_a828_144992b53376.slice/crio-d30d76ea5f8baa5d64c8d89a29facec30bf4301875366c478e7c6aa37448fb3e WatchSource:0}: Error finding container d30d76ea5f8baa5d64c8d89a29facec30bf4301875366c478e7c6aa37448fb3e: Status 404 returned error can't find the container with id d30d76ea5f8baa5d64c8d89a29facec30bf4301875366c478e7c6aa37448fb3e Dec 04 10:29:52 crc kubenswrapper[4831]: I1204 10:29:52.547268 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q"] Dec 04 10:29:53 crc kubenswrapper[4831]: I1204 10:29:53.442328 4831 generic.go:334] "Generic (PLEG): container finished" podID="b1dcbe95-c901-4e40-a828-144992b53376" containerID="8919ab906fc543ecdff9b8a8c931be7020c1d036dd90e1155d5f74b96a0877a9" exitCode=0 Dec 04 
10:29:53 crc kubenswrapper[4831]: I1204 10:29:53.442457 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" event={"ID":"b1dcbe95-c901-4e40-a828-144992b53376","Type":"ContainerDied","Data":"8919ab906fc543ecdff9b8a8c931be7020c1d036dd90e1155d5f74b96a0877a9"} Dec 04 10:29:53 crc kubenswrapper[4831]: I1204 10:29:53.443151 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" event={"ID":"b1dcbe95-c901-4e40-a828-144992b53376","Type":"ContainerStarted","Data":"d30d76ea5f8baa5d64c8d89a29facec30bf4301875366c478e7c6aa37448fb3e"} Dec 04 10:29:54 crc kubenswrapper[4831]: I1204 10:29:54.458196 4831 generic.go:334] "Generic (PLEG): container finished" podID="b1dcbe95-c901-4e40-a828-144992b53376" containerID="7d5fd584cf42840c617b7316f735669d1060dadf47e5fd6628a6c580d53b5e3f" exitCode=0 Dec 04 10:29:54 crc kubenswrapper[4831]: I1204 10:29:54.458262 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" event={"ID":"b1dcbe95-c901-4e40-a828-144992b53376","Type":"ContainerDied","Data":"7d5fd584cf42840c617b7316f735669d1060dadf47e5fd6628a6c580d53b5e3f"} Dec 04 10:29:55 crc kubenswrapper[4831]: I1204 10:29:55.470334 4831 generic.go:334] "Generic (PLEG): container finished" podID="b1dcbe95-c901-4e40-a828-144992b53376" containerID="22623af7e8ecca4e707cf3cf6858fbe74cf4bdae751995ce07ae64421b05897d" exitCode=0 Dec 04 10:29:55 crc kubenswrapper[4831]: I1204 10:29:55.470409 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" event={"ID":"b1dcbe95-c901-4e40-a828-144992b53376","Type":"ContainerDied","Data":"22623af7e8ecca4e707cf3cf6858fbe74cf4bdae751995ce07ae64421b05897d"} Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.857935 
4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.914426 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7hbg8"] Dec 04 10:29:56 crc kubenswrapper[4831]: E1204 10:29:56.914806 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1dcbe95-c901-4e40-a828-144992b53376" containerName="util" Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.914830 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1dcbe95-c901-4e40-a828-144992b53376" containerName="util" Dec 04 10:29:56 crc kubenswrapper[4831]: E1204 10:29:56.914846 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1dcbe95-c901-4e40-a828-144992b53376" containerName="pull" Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.914855 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1dcbe95-c901-4e40-a828-144992b53376" containerName="pull" Dec 04 10:29:56 crc kubenswrapper[4831]: E1204 10:29:56.914874 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1dcbe95-c901-4e40-a828-144992b53376" containerName="extract" Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.914884 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1dcbe95-c901-4e40-a828-144992b53376" containerName="extract" Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.915062 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1dcbe95-c901-4e40-a828-144992b53376" containerName="extract" Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.916527 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.946650 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hbg8"]
Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.967600 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-bundle\") pod \"b1dcbe95-c901-4e40-a828-144992b53376\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") "
Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.967732 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kjxd\" (UniqueName: \"kubernetes.io/projected/b1dcbe95-c901-4e40-a828-144992b53376-kube-api-access-8kjxd\") pod \"b1dcbe95-c901-4e40-a828-144992b53376\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") "
Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.967784 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-util\") pod \"b1dcbe95-c901-4e40-a828-144992b53376\" (UID: \"b1dcbe95-c901-4e40-a828-144992b53376\") "
Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.967995 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7b2k\" (UniqueName: \"kubernetes.io/projected/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-kube-api-access-q7b2k\") pod \"community-operators-7hbg8\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") " pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.968060 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-utilities\") pod \"community-operators-7hbg8\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") " pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.968094 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-catalog-content\") pod \"community-operators-7hbg8\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") " pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.968578 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-bundle" (OuterVolumeSpecName: "bundle") pod "b1dcbe95-c901-4e40-a828-144992b53376" (UID: "b1dcbe95-c901-4e40-a828-144992b53376"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.973641 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1dcbe95-c901-4e40-a828-144992b53376-kube-api-access-8kjxd" (OuterVolumeSpecName: "kube-api-access-8kjxd") pod "b1dcbe95-c901-4e40-a828-144992b53376" (UID: "b1dcbe95-c901-4e40-a828-144992b53376"). InnerVolumeSpecName "kube-api-access-8kjxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:29:56 crc kubenswrapper[4831]: I1204 10:29:56.984918 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-util" (OuterVolumeSpecName: "util") pod "b1dcbe95-c901-4e40-a828-144992b53376" (UID: "b1dcbe95-c901-4e40-a828-144992b53376"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.069402 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-utilities\") pod \"community-operators-7hbg8\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") " pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.069453 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-catalog-content\") pod \"community-operators-7hbg8\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") " pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.069518 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7b2k\" (UniqueName: \"kubernetes.io/projected/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-kube-api-access-q7b2k\") pod \"community-operators-7hbg8\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") " pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.070078 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.070098 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kjxd\" (UniqueName: \"kubernetes.io/projected/b1dcbe95-c901-4e40-a828-144992b53376-kube-api-access-8kjxd\") on node \"crc\" DevicePath \"\""
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.070110 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1dcbe95-c901-4e40-a828-144992b53376-util\") on node \"crc\" DevicePath \"\""
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.071412 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-catalog-content\") pod \"community-operators-7hbg8\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") " pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.071798 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-utilities\") pod \"community-operators-7hbg8\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") " pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.098590 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7b2k\" (UniqueName: \"kubernetes.io/projected/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-kube-api-access-q7b2k\") pod \"community-operators-7hbg8\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") " pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.242750 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.503034 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q" event={"ID":"b1dcbe95-c901-4e40-a828-144992b53376","Type":"ContainerDied","Data":"d30d76ea5f8baa5d64c8d89a29facec30bf4301875366c478e7c6aa37448fb3e"}
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.503318 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d30d76ea5f8baa5d64c8d89a29facec30bf4301875366c478e7c6aa37448fb3e"
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.503396 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q"
Dec 04 10:29:57 crc kubenswrapper[4831]: I1204 10:29:57.524077 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hbg8"]
Dec 04 10:29:58 crc kubenswrapper[4831]: I1204 10:29:58.513819 4831 generic.go:334] "Generic (PLEG): container finished" podID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" containerID="ea79bc6d4ecff0be8ec3ddba7a7934d9cfa97de6d116981d27d44b492a44bfbd" exitCode=0
Dec 04 10:29:58 crc kubenswrapper[4831]: I1204 10:29:58.513906 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hbg8" event={"ID":"c4ad7a05-4435-4e15-9402-60bcdb3cda0f","Type":"ContainerDied","Data":"ea79bc6d4ecff0be8ec3ddba7a7934d9cfa97de6d116981d27d44b492a44bfbd"}
Dec 04 10:29:58 crc kubenswrapper[4831]: I1204 10:29:58.513980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hbg8" event={"ID":"c4ad7a05-4435-4e15-9402-60bcdb3cda0f","Type":"ContainerStarted","Data":"bb3c2c5cf04a7a3af276090d1b2e9047a9d36ffa2e4327fa2165ebbe1ce54569"}
Dec 04 10:29:59 crc kubenswrapper[4831]: I1204 10:29:59.525419 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hbg8" event={"ID":"c4ad7a05-4435-4e15-9402-60bcdb3cda0f","Type":"ContainerStarted","Data":"9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413"}
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.174401 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"]
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.175286 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.178498 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.179066 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.190032 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"]
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.319965 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afb08d97-21c7-4452-b8a7-8a8776ee28dd-secret-volume\") pod \"collect-profiles-29414070-rl7v9\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.320568 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afb08d97-21c7-4452-b8a7-8a8776ee28dd-config-volume\") pod \"collect-profiles-29414070-rl7v9\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.320785 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckcj9\" (UniqueName: \"kubernetes.io/projected/afb08d97-21c7-4452-b8a7-8a8776ee28dd-kube-api-access-ckcj9\") pod \"collect-profiles-29414070-rl7v9\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.421846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckcj9\" (UniqueName: \"kubernetes.io/projected/afb08d97-21c7-4452-b8a7-8a8776ee28dd-kube-api-access-ckcj9\") pod \"collect-profiles-29414070-rl7v9\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.422008 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afb08d97-21c7-4452-b8a7-8a8776ee28dd-secret-volume\") pod \"collect-profiles-29414070-rl7v9\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.422083 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afb08d97-21c7-4452-b8a7-8a8776ee28dd-config-volume\") pod \"collect-profiles-29414070-rl7v9\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.423846 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afb08d97-21c7-4452-b8a7-8a8776ee28dd-config-volume\") pod \"collect-profiles-29414070-rl7v9\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.432168 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afb08d97-21c7-4452-b8a7-8a8776ee28dd-secret-volume\") pod \"collect-profiles-29414070-rl7v9\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.451353 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckcj9\" (UniqueName: \"kubernetes.io/projected/afb08d97-21c7-4452-b8a7-8a8776ee28dd-kube-api-access-ckcj9\") pod \"collect-profiles-29414070-rl7v9\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.513532 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.538500 4831 generic.go:334] "Generic (PLEG): container finished" podID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" containerID="9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413" exitCode=0
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.540428 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hbg8" event={"ID":"c4ad7a05-4435-4e15-9402-60bcdb3cda0f","Type":"ContainerDied","Data":"9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413"}
Dec 04 10:30:00 crc kubenswrapper[4831]: I1204 10:30:00.985028 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"]
Dec 04 10:30:01 crc kubenswrapper[4831]: I1204 10:30:01.549456 4831 generic.go:334] "Generic (PLEG): container finished" podID="afb08d97-21c7-4452-b8a7-8a8776ee28dd" containerID="bab3605276bec84e3f327d7aaa6d5632752bdf9a166e75a92d25d378b0f8e676" exitCode=0
Dec 04 10:30:01 crc kubenswrapper[4831]: I1204 10:30:01.549554 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9" event={"ID":"afb08d97-21c7-4452-b8a7-8a8776ee28dd","Type":"ContainerDied","Data":"bab3605276bec84e3f327d7aaa6d5632752bdf9a166e75a92d25d378b0f8e676"}
Dec 04 10:30:01 crc kubenswrapper[4831]: I1204 10:30:01.551217 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9" event={"ID":"afb08d97-21c7-4452-b8a7-8a8776ee28dd","Type":"ContainerStarted","Data":"9ce2821116e97934ca3e1fadd0575a8ee98bfa81d79d4644d265aba027247a6c"}
Dec 04 10:30:01 crc kubenswrapper[4831]: I1204 10:30:01.596743 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7hbg8" podStartSLOduration=2.803473056 podStartE2EDuration="5.596715502s" podCreationTimestamp="2025-12-04 10:29:56 +0000 UTC" firstStartedPulling="2025-12-04 10:29:58.515649411 +0000 UTC m=+895.464824725" lastFinishedPulling="2025-12-04 10:30:01.308891847 +0000 UTC m=+898.258067171" observedRunningTime="2025-12-04 10:30:01.594091402 +0000 UTC m=+898.543266786" watchObservedRunningTime="2025-12-04 10:30:01.596715502 +0000 UTC m=+898.545890836"
Dec 04 10:30:02 crc kubenswrapper[4831]: I1204 10:30:02.566249 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hbg8" event={"ID":"c4ad7a05-4435-4e15-9402-60bcdb3cda0f","Type":"ContainerStarted","Data":"92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3"}
Dec 04 10:30:02 crc kubenswrapper[4831]: I1204 10:30:02.845909 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:02 crc kubenswrapper[4831]: I1204 10:30:02.957344 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afb08d97-21c7-4452-b8a7-8a8776ee28dd-secret-volume\") pod \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") "
Dec 04 10:30:02 crc kubenswrapper[4831]: I1204 10:30:02.957410 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckcj9\" (UniqueName: \"kubernetes.io/projected/afb08d97-21c7-4452-b8a7-8a8776ee28dd-kube-api-access-ckcj9\") pod \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") "
Dec 04 10:30:02 crc kubenswrapper[4831]: I1204 10:30:02.957466 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afb08d97-21c7-4452-b8a7-8a8776ee28dd-config-volume\") pod \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\" (UID: \"afb08d97-21c7-4452-b8a7-8a8776ee28dd\") "
Dec 04 10:30:02 crc kubenswrapper[4831]: I1204 10:30:02.958444 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afb08d97-21c7-4452-b8a7-8a8776ee28dd-config-volume" (OuterVolumeSpecName: "config-volume") pod "afb08d97-21c7-4452-b8a7-8a8776ee28dd" (UID: "afb08d97-21c7-4452-b8a7-8a8776ee28dd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 10:30:02 crc kubenswrapper[4831]: I1204 10:30:02.963348 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb08d97-21c7-4452-b8a7-8a8776ee28dd-kube-api-access-ckcj9" (OuterVolumeSpecName: "kube-api-access-ckcj9") pod "afb08d97-21c7-4452-b8a7-8a8776ee28dd" (UID: "afb08d97-21c7-4452-b8a7-8a8776ee28dd"). InnerVolumeSpecName "kube-api-access-ckcj9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:30:02 crc kubenswrapper[4831]: I1204 10:30:02.963897 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb08d97-21c7-4452-b8a7-8a8776ee28dd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "afb08d97-21c7-4452-b8a7-8a8776ee28dd" (UID: "afb08d97-21c7-4452-b8a7-8a8776ee28dd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.059294 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afb08d97-21c7-4452-b8a7-8a8776ee28dd-config-volume\") on node \"crc\" DevicePath \"\""
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.059323 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afb08d97-21c7-4452-b8a7-8a8776ee28dd-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.059333 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckcj9\" (UniqueName: \"kubernetes.io/projected/afb08d97-21c7-4452-b8a7-8a8776ee28dd-kube-api-access-ckcj9\") on node \"crc\" DevicePath \"\""
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.354785 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr"]
Dec 04 10:30:03 crc kubenswrapper[4831]: E1204 10:30:03.355092 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb08d97-21c7-4452-b8a7-8a8776ee28dd" containerName="collect-profiles"
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.355107 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb08d97-21c7-4452-b8a7-8a8776ee28dd" containerName="collect-profiles"
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.355208 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb08d97-21c7-4452-b8a7-8a8776ee28dd" containerName="collect-profiles"
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.355852 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr"
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.359369 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-bl952"
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.362036 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-557ks\" (UniqueName: \"kubernetes.io/projected/308de017-ff36-4bae-95e7-e0b5c986e62e-kube-api-access-557ks\") pod \"openstack-operator-controller-operator-78f7b66457-gw6rr\" (UID: \"308de017-ff36-4bae-95e7-e0b5c986e62e\") " pod="openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr"
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.393505 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr"]
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.462979 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-557ks\" (UniqueName: \"kubernetes.io/projected/308de017-ff36-4bae-95e7-e0b5c986e62e-kube-api-access-557ks\") pod \"openstack-operator-controller-operator-78f7b66457-gw6rr\" (UID: \"308de017-ff36-4bae-95e7-e0b5c986e62e\") " pod="openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr"
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.485146 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-557ks\" (UniqueName: \"kubernetes.io/projected/308de017-ff36-4bae-95e7-e0b5c986e62e-kube-api-access-557ks\") pod \"openstack-operator-controller-operator-78f7b66457-gw6rr\" (UID: \"308de017-ff36-4bae-95e7-e0b5c986e62e\") " pod="openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr"
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.572840 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9" event={"ID":"afb08d97-21c7-4452-b8a7-8a8776ee28dd","Type":"ContainerDied","Data":"9ce2821116e97934ca3e1fadd0575a8ee98bfa81d79d4644d265aba027247a6c"}
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.572889 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ce2821116e97934ca3e1fadd0575a8ee98bfa81d79d4644d265aba027247a6c"
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.572912 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"
Dec 04 10:30:03 crc kubenswrapper[4831]: I1204 10:30:03.672190 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr"
Dec 04 10:30:04 crc kubenswrapper[4831]: I1204 10:30:04.114031 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr"]
Dec 04 10:30:04 crc kubenswrapper[4831]: W1204 10:30:04.117218 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod308de017_ff36_4bae_95e7_e0b5c986e62e.slice/crio-81cfa6835dfc95c58b650a44a3ff0744ec80ac4d95d8a7ab4f86cf1462f347ba WatchSource:0}: Error finding container 81cfa6835dfc95c58b650a44a3ff0744ec80ac4d95d8a7ab4f86cf1462f347ba: Status 404 returned error can't find the container with id 81cfa6835dfc95c58b650a44a3ff0744ec80ac4d95d8a7ab4f86cf1462f347ba
Dec 04 10:30:04 crc kubenswrapper[4831]: I1204 10:30:04.580365 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr" event={"ID":"308de017-ff36-4bae-95e7-e0b5c986e62e","Type":"ContainerStarted","Data":"81cfa6835dfc95c58b650a44a3ff0744ec80ac4d95d8a7ab4f86cf1462f347ba"}
Dec 04 10:30:05 crc kubenswrapper[4831]: E1204 10:30:05.344875 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice/crio-9ce2821116e97934ca3e1fadd0575a8ee98bfa81d79d4644d265aba027247a6c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice\": RecentStats: unable to find data in memory cache]"
Dec 04 10:30:07 crc kubenswrapper[4831]: I1204 10:30:07.243548 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:30:07 crc kubenswrapper[4831]: I1204 10:30:07.243818 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:30:07 crc kubenswrapper[4831]: I1204 10:30:07.288222 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:30:07 crc kubenswrapper[4831]: I1204 10:30:07.671410 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.114131 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-47dct"]
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.120602 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47dct"
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.123874 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47dct"]
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.137731 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-utilities\") pod \"certified-operators-47dct\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " pod="openshift-marketplace/certified-operators-47dct"
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.138308 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-catalog-content\") pod \"certified-operators-47dct\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " pod="openshift-marketplace/certified-operators-47dct"
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.138469 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk2tm\" (UniqueName: \"kubernetes.io/projected/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-kube-api-access-kk2tm\") pod \"certified-operators-47dct\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " pod="openshift-marketplace/certified-operators-47dct"
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.240127 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-utilities\") pod \"certified-operators-47dct\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " pod="openshift-marketplace/certified-operators-47dct"
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.240622 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-catalog-content\") pod \"certified-operators-47dct\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " pod="openshift-marketplace/certified-operators-47dct"
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.240980 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk2tm\" (UniqueName: \"kubernetes.io/projected/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-kube-api-access-kk2tm\") pod \"certified-operators-47dct\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " pod="openshift-marketplace/certified-operators-47dct"
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.241210 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-catalog-content\") pod \"certified-operators-47dct\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " pod="openshift-marketplace/certified-operators-47dct"
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.240797 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-utilities\") pod \"certified-operators-47dct\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " pod="openshift-marketplace/certified-operators-47dct"
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.265482 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk2tm\" (UniqueName: \"kubernetes.io/projected/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-kube-api-access-kk2tm\") pod \"certified-operators-47dct\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " pod="openshift-marketplace/certified-operators-47dct"
Dec 04 10:30:08 crc kubenswrapper[4831]: I1204 10:30:08.447794 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47dct"
Dec 04 10:30:09 crc kubenswrapper[4831]: I1204 10:30:09.111361 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47dct"]
Dec 04 10:30:09 crc kubenswrapper[4831]: W1204 10:30:09.119610 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dc98bb3_e6ba_4f98_b8ac_e4fa5b333b79.slice/crio-5b5338e07f013259b64c8533070b1389563805ff3b594e8f27d9b6c621034591 WatchSource:0}: Error finding container 5b5338e07f013259b64c8533070b1389563805ff3b594e8f27d9b6c621034591: Status 404 returned error can't find the container with id 5b5338e07f013259b64c8533070b1389563805ff3b594e8f27d9b6c621034591
Dec 04 10:30:09 crc kubenswrapper[4831]: I1204 10:30:09.625646 4831 generic.go:334] "Generic (PLEG): container finished" podID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" containerID="e679a4c381084248cf2b44471c4c35b844c3ddca45b7bb25f734af2aec867459" exitCode=0
Dec 04 10:30:09 crc kubenswrapper[4831]: I1204 10:30:09.625786 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47dct" event={"ID":"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79","Type":"ContainerDied","Data":"e679a4c381084248cf2b44471c4c35b844c3ddca45b7bb25f734af2aec867459"}
Dec 04 10:30:09 crc kubenswrapper[4831]: I1204 10:30:09.626040 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47dct" event={"ID":"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79","Type":"ContainerStarted","Data":"5b5338e07f013259b64c8533070b1389563805ff3b594e8f27d9b6c621034591"}
Dec 04 10:30:09 crc kubenswrapper[4831]: I1204 10:30:09.628050 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr" event={"ID":"308de017-ff36-4bae-95e7-e0b5c986e62e","Type":"ContainerStarted","Data":"c30692065b85bb2b4f7a985c6511374ccc1494aceee8e263d29256801345a872"}
Dec 04 10:30:10 crc kubenswrapper[4831]: I1204 10:30:10.903464 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hbg8"]
Dec 04 10:30:10 crc kubenswrapper[4831]: I1204 10:30:10.904395 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7hbg8" podUID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" containerName="registry-server" containerID="cri-o://92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3" gracePeriod=2
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.562913 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.586549 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-utilities\") pod \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") "
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.586593 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-catalog-content\") pod \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") "
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.587689 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-utilities" (OuterVolumeSpecName: "utilities") pod "c4ad7a05-4435-4e15-9402-60bcdb3cda0f" (UID: "c4ad7a05-4435-4e15-9402-60bcdb3cda0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.599805 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7b2k\" (UniqueName: \"kubernetes.io/projected/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-kube-api-access-q7b2k\") pod \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\" (UID: \"c4ad7a05-4435-4e15-9402-60bcdb3cda0f\") "
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.600194 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.621778 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-kube-api-access-q7b2k" (OuterVolumeSpecName: "kube-api-access-q7b2k") pod "c4ad7a05-4435-4e15-9402-60bcdb3cda0f" (UID: "c4ad7a05-4435-4e15-9402-60bcdb3cda0f"). InnerVolumeSpecName "kube-api-access-q7b2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.643543 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr" event={"ID":"308de017-ff36-4bae-95e7-e0b5c986e62e","Type":"ContainerStarted","Data":"e55cbdcc981ee3c86b7999d661c286a38ef6197567dd3105984c2ab25ed2871f"}
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.643671 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4ad7a05-4435-4e15-9402-60bcdb3cda0f" (UID: "c4ad7a05-4435-4e15-9402-60bcdb3cda0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.643776 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr"
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.653926 4831 generic.go:334] "Generic (PLEG): container finished" podID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" containerID="92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3" exitCode=0
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.654021 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hbg8" event={"ID":"c4ad7a05-4435-4e15-9402-60bcdb3cda0f","Type":"ContainerDied","Data":"92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3"}
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.654056 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hbg8" event={"ID":"c4ad7a05-4435-4e15-9402-60bcdb3cda0f","Type":"ContainerDied","Data":"bb3c2c5cf04a7a3af276090d1b2e9047a9d36ffa2e4327fa2165ebbe1ce54569"}
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.654079 4831 scope.go:117] "RemoveContainer" containerID="92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3"
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.654222 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hbg8"
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.659075 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47dct" event={"ID":"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79","Type":"ContainerStarted","Data":"5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2"}
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.669179 4831 scope.go:117] "RemoveContainer" containerID="9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413"
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.701676 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7b2k\" (UniqueName: \"kubernetes.io/projected/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-kube-api-access-q7b2k\") on node \"crc\" DevicePath \"\""
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.701722 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ad7a05-4435-4e15-9402-60bcdb3cda0f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.703474 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr" podStartSLOduration=1.494620452 podStartE2EDuration="8.703461201s" podCreationTimestamp="2025-12-04 10:30:03 +0000 UTC" firstStartedPulling="2025-12-04 10:30:04.119095916 +0000 UTC m=+901.068271230" lastFinishedPulling="2025-12-04 10:30:11.327936665 +0000 UTC m=+908.277111979" observedRunningTime="2025-12-04 10:30:11.690821643 +0000 UTC m=+908.639996967" watchObservedRunningTime="2025-12-04 10:30:11.703461201 +0000 UTC m=+908.652636515"
Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.710582 4831 scope.go:117] "RemoveContainer" containerID="ea79bc6d4ecff0be8ec3ddba7a7934d9cfa97de6d116981d27d44b492a44bfbd"
Dec 04 10:30:11 crc 
kubenswrapper[4831]: I1204 10:30:11.731494 4831 scope.go:117] "RemoveContainer" containerID="92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3" Dec 04 10:30:11 crc kubenswrapper[4831]: E1204 10:30:11.731816 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3\": container with ID starting with 92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3 not found: ID does not exist" containerID="92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3" Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.731853 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3"} err="failed to get container status \"92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3\": rpc error: code = NotFound desc = could not find container \"92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3\": container with ID starting with 92eed03502958aea74b5c6d5a2647c5529485c1904e83c617602bcecb84fecb3 not found: ID does not exist" Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.731880 4831 scope.go:117] "RemoveContainer" containerID="9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413" Dec 04 10:30:11 crc kubenswrapper[4831]: E1204 10:30:11.732073 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413\": container with ID starting with 9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413 not found: ID does not exist" containerID="9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413" Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.732106 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413"} err="failed to get container status \"9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413\": rpc error: code = NotFound desc = could not find container \"9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413\": container with ID starting with 9ffb4d77dd5f562344e56442c4d9461af8b6db87480f4450099891c0e75bb413 not found: ID does not exist" Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.732123 4831 scope.go:117] "RemoveContainer" containerID="ea79bc6d4ecff0be8ec3ddba7a7934d9cfa97de6d116981d27d44b492a44bfbd" Dec 04 10:30:11 crc kubenswrapper[4831]: E1204 10:30:11.732310 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea79bc6d4ecff0be8ec3ddba7a7934d9cfa97de6d116981d27d44b492a44bfbd\": container with ID starting with ea79bc6d4ecff0be8ec3ddba7a7934d9cfa97de6d116981d27d44b492a44bfbd not found: ID does not exist" containerID="ea79bc6d4ecff0be8ec3ddba7a7934d9cfa97de6d116981d27d44b492a44bfbd" Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.732337 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea79bc6d4ecff0be8ec3ddba7a7934d9cfa97de6d116981d27d44b492a44bfbd"} err="failed to get container status \"ea79bc6d4ecff0be8ec3ddba7a7934d9cfa97de6d116981d27d44b492a44bfbd\": rpc error: code = NotFound desc = could not find container \"ea79bc6d4ecff0be8ec3ddba7a7934d9cfa97de6d116981d27d44b492a44bfbd\": container with ID starting with ea79bc6d4ecff0be8ec3ddba7a7934d9cfa97de6d116981d27d44b492a44bfbd not found: ID does not exist" Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.732413 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hbg8"] Dec 04 10:30:11 crc kubenswrapper[4831]: I1204 10:30:11.737931 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-7hbg8"] Dec 04 10:30:12 crc kubenswrapper[4831]: I1204 10:30:12.672552 4831 generic.go:334] "Generic (PLEG): container finished" podID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" containerID="5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2" exitCode=0 Dec 04 10:30:12 crc kubenswrapper[4831]: I1204 10:30:12.672615 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47dct" event={"ID":"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79","Type":"ContainerDied","Data":"5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2"} Dec 04 10:30:13 crc kubenswrapper[4831]: I1204 10:30:13.287234 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" path="/var/lib/kubelet/pods/c4ad7a05-4435-4e15-9402-60bcdb3cda0f/volumes" Dec 04 10:30:13 crc kubenswrapper[4831]: I1204 10:30:13.683199 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47dct" event={"ID":"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79","Type":"ContainerStarted","Data":"4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0"} Dec 04 10:30:13 crc kubenswrapper[4831]: I1204 10:30:13.716310 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-47dct" podStartSLOduration=2.171117679 podStartE2EDuration="5.716292322s" podCreationTimestamp="2025-12-04 10:30:08 +0000 UTC" firstStartedPulling="2025-12-04 10:30:09.627983048 +0000 UTC m=+906.577158402" lastFinishedPulling="2025-12-04 10:30:13.173157731 +0000 UTC m=+910.122333045" observedRunningTime="2025-12-04 10:30:13.712571522 +0000 UTC m=+910.661746886" watchObservedRunningTime="2025-12-04 10:30:13.716292322 +0000 UTC m=+910.665467646" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.510880 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-z728q"] Dec 04 10:30:14 crc kubenswrapper[4831]: E1204 10:30:14.511177 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" containerName="registry-server" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.511191 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" containerName="registry-server" Dec 04 10:30:14 crc kubenswrapper[4831]: E1204 10:30:14.511214 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" containerName="extract-utilities" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.511222 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" containerName="extract-utilities" Dec 04 10:30:14 crc kubenswrapper[4831]: E1204 10:30:14.511233 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" containerName="extract-content" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.511240 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" containerName="extract-content" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.511381 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ad7a05-4435-4e15-9402-60bcdb3cda0f" containerName="registry-server" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.512507 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.530554 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z728q"] Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.643740 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvkl7\" (UniqueName: \"kubernetes.io/projected/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-kube-api-access-lvkl7\") pod \"redhat-marketplace-z728q\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.643844 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-utilities\") pod \"redhat-marketplace-z728q\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.643936 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-catalog-content\") pod \"redhat-marketplace-z728q\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.744911 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-utilities\") pod \"redhat-marketplace-z728q\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.745004 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-catalog-content\") pod \"redhat-marketplace-z728q\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.745073 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvkl7\" (UniqueName: \"kubernetes.io/projected/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-kube-api-access-lvkl7\") pod \"redhat-marketplace-z728q\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.745559 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-utilities\") pod \"redhat-marketplace-z728q\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.745594 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-catalog-content\") pod \"redhat-marketplace-z728q\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.770109 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvkl7\" (UniqueName: \"kubernetes.io/projected/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-kube-api-access-lvkl7\") pod \"redhat-marketplace-z728q\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:14 crc kubenswrapper[4831]: I1204 10:30:14.832232 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:15 crc kubenswrapper[4831]: I1204 10:30:15.389035 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z728q"] Dec 04 10:30:15 crc kubenswrapper[4831]: W1204 10:30:15.394018 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebe3cfaf_869d_49e5_9a25_46b7dfc86a59.slice/crio-39c387430e999e018c0e6f9833fbdff911320b406e7dcd26570efaa06e6acf15 WatchSource:0}: Error finding container 39c387430e999e018c0e6f9833fbdff911320b406e7dcd26570efaa06e6acf15: Status 404 returned error can't find the container with id 39c387430e999e018c0e6f9833fbdff911320b406e7dcd26570efaa06e6acf15 Dec 04 10:30:15 crc kubenswrapper[4831]: E1204 10:30:15.501225 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice/crio-9ce2821116e97934ca3e1fadd0575a8ee98bfa81d79d4644d265aba027247a6c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice\": RecentStats: unable to find data in memory cache]" Dec 04 10:30:15 crc kubenswrapper[4831]: I1204 10:30:15.697901 4831 generic.go:334] "Generic (PLEG): container finished" podID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" containerID="3bdcb1046e2f8741fe746c9eba2f31809a71d288c8c0805af85d4bd78405717c" exitCode=0 Dec 04 10:30:15 crc kubenswrapper[4831]: I1204 10:30:15.698100 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z728q" event={"ID":"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59","Type":"ContainerDied","Data":"3bdcb1046e2f8741fe746c9eba2f31809a71d288c8c0805af85d4bd78405717c"} Dec 04 10:30:15 crc kubenswrapper[4831]: I1204 10:30:15.698180 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z728q" event={"ID":"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59","Type":"ContainerStarted","Data":"39c387430e999e018c0e6f9833fbdff911320b406e7dcd26570efaa06e6acf15"} Dec 04 10:30:16 crc kubenswrapper[4831]: I1204 10:30:16.708370 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z728q" event={"ID":"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59","Type":"ContainerStarted","Data":"74e8ecf524438f04ecac2401d068b5808d9ba5ab0ea54108a90c0aecd0c2222d"} Dec 04 10:30:17 crc kubenswrapper[4831]: I1204 10:30:17.721562 4831 generic.go:334] "Generic (PLEG): container finished" podID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" containerID="74e8ecf524438f04ecac2401d068b5808d9ba5ab0ea54108a90c0aecd0c2222d" exitCode=0 Dec 04 10:30:17 crc kubenswrapper[4831]: I1204 10:30:17.721635 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z728q" event={"ID":"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59","Type":"ContainerDied","Data":"74e8ecf524438f04ecac2401d068b5808d9ba5ab0ea54108a90c0aecd0c2222d"} Dec 04 10:30:18 crc kubenswrapper[4831]: I1204 10:30:18.448484 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-47dct" Dec 04 10:30:18 crc kubenswrapper[4831]: I1204 10:30:18.448535 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-47dct" Dec 04 10:30:18 crc kubenswrapper[4831]: I1204 10:30:18.512442 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-47dct" Dec 04 10:30:18 crc kubenswrapper[4831]: I1204 10:30:18.739464 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z728q" 
event={"ID":"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59","Type":"ContainerStarted","Data":"d13a258d304924dbb4e680859d7c521ae68834c09bd01bb1e00b94850fa51bf6"} Dec 04 10:30:18 crc kubenswrapper[4831]: I1204 10:30:18.767104 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z728q" podStartSLOduration=2.326737527 podStartE2EDuration="4.766936838s" podCreationTimestamp="2025-12-04 10:30:14 +0000 UTC" firstStartedPulling="2025-12-04 10:30:15.699352477 +0000 UTC m=+912.648527801" lastFinishedPulling="2025-12-04 10:30:18.139551788 +0000 UTC m=+915.088727112" observedRunningTime="2025-12-04 10:30:18.763224939 +0000 UTC m=+915.712400273" watchObservedRunningTime="2025-12-04 10:30:18.766936838 +0000 UTC m=+915.716112152" Dec 04 10:30:18 crc kubenswrapper[4831]: I1204 10:30:18.814028 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-47dct" Dec 04 10:30:22 crc kubenswrapper[4831]: I1204 10:30:22.104525 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-47dct"] Dec 04 10:30:22 crc kubenswrapper[4831]: I1204 10:30:22.105188 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-47dct" podUID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" containerName="registry-server" containerID="cri-o://4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0" gracePeriod=2 Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.649717 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-47dct" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.679390 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-78f7b66457-gw6rr" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.771368 4831 generic.go:334] "Generic (PLEG): container finished" podID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" containerID="4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0" exitCode=0 Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.771415 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47dct" event={"ID":"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79","Type":"ContainerDied","Data":"4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0"} Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.771445 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47dct" event={"ID":"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79","Type":"ContainerDied","Data":"5b5338e07f013259b64c8533070b1389563805ff3b594e8f27d9b6c621034591"} Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.771463 4831 scope.go:117] "RemoveContainer" containerID="4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.771585 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-47dct" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.773045 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-utilities\") pod \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.773114 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk2tm\" (UniqueName: \"kubernetes.io/projected/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-kube-api-access-kk2tm\") pod \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.773192 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-catalog-content\") pod \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\" (UID: \"0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79\") " Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.774056 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-utilities" (OuterVolumeSpecName: "utilities") pod "0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" (UID: "0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.785481 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-kube-api-access-kk2tm" (OuterVolumeSpecName: "kube-api-access-kk2tm") pod "0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" (UID: "0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79"). InnerVolumeSpecName "kube-api-access-kk2tm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.810197 4831 scope.go:117] "RemoveContainer" containerID="5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.815607 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" (UID: "0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.827681 4831 scope.go:117] "RemoveContainer" containerID="e679a4c381084248cf2b44471c4c35b844c3ddca45b7bb25f734af2aec867459" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.850354 4831 scope.go:117] "RemoveContainer" containerID="4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0" Dec 04 10:30:23 crc kubenswrapper[4831]: E1204 10:30:23.857128 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0\": container with ID starting with 4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0 not found: ID does not exist" containerID="4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.857307 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0"} err="failed to get container status \"4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0\": rpc error: code = NotFound desc = could not find container \"4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0\": container with ID starting 
with 4cd7fd5c0b127f02ee4110587f7ec410084a12d9b09606a58c032b0611046cb0 not found: ID does not exist" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.857436 4831 scope.go:117] "RemoveContainer" containerID="5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2" Dec 04 10:30:23 crc kubenswrapper[4831]: E1204 10:30:23.857937 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2\": container with ID starting with 5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2 not found: ID does not exist" containerID="5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.857983 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2"} err="failed to get container status \"5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2\": rpc error: code = NotFound desc = could not find container \"5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2\": container with ID starting with 5e9c12502ece6fc037cfd5c82d5b2c4429f493a6e1fd61e21b92c0aae492c2a2 not found: ID does not exist" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.858014 4831 scope.go:117] "RemoveContainer" containerID="e679a4c381084248cf2b44471c4c35b844c3ddca45b7bb25f734af2aec867459" Dec 04 10:30:23 crc kubenswrapper[4831]: E1204 10:30:23.858481 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e679a4c381084248cf2b44471c4c35b844c3ddca45b7bb25f734af2aec867459\": container with ID starting with e679a4c381084248cf2b44471c4c35b844c3ddca45b7bb25f734af2aec867459 not found: ID does not exist" containerID="e679a4c381084248cf2b44471c4c35b844c3ddca45b7bb25f734af2aec867459" Dec 04 10:30:23 
crc kubenswrapper[4831]: I1204 10:30:23.858512 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e679a4c381084248cf2b44471c4c35b844c3ddca45b7bb25f734af2aec867459"} err="failed to get container status \"e679a4c381084248cf2b44471c4c35b844c3ddca45b7bb25f734af2aec867459\": rpc error: code = NotFound desc = could not find container \"e679a4c381084248cf2b44471c4c35b844c3ddca45b7bb25f734af2aec867459\": container with ID starting with e679a4c381084248cf2b44471c4c35b844c3ddca45b7bb25f734af2aec867459 not found: ID does not exist" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.874512 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.874541 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:23 crc kubenswrapper[4831]: I1204 10:30:23.874552 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk2tm\" (UniqueName: \"kubernetes.io/projected/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79-kube-api-access-kk2tm\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:24 crc kubenswrapper[4831]: I1204 10:30:24.100738 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-47dct"] Dec 04 10:30:24 crc kubenswrapper[4831]: I1204 10:30:24.108853 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-47dct"] Dec 04 10:30:24 crc kubenswrapper[4831]: I1204 10:30:24.832765 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:24 crc kubenswrapper[4831]: I1204 10:30:24.832962 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:24 crc kubenswrapper[4831]: I1204 10:30:24.898218 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:25 crc kubenswrapper[4831]: I1204 10:30:25.292047 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" path="/var/lib/kubelet/pods/0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79/volumes" Dec 04 10:30:25 crc kubenswrapper[4831]: E1204 10:30:25.688166 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice/crio-9ce2821116e97934ca3e1fadd0575a8ee98bfa81d79d4644d265aba027247a6c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice\": RecentStats: unable to find data in memory cache]" Dec 04 10:30:25 crc kubenswrapper[4831]: I1204 10:30:25.840691 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:28 crc kubenswrapper[4831]: I1204 10:30:28.513448 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z728q"] Dec 04 10:30:28 crc kubenswrapper[4831]: I1204 10:30:28.813479 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z728q" podUID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" containerName="registry-server" containerID="cri-o://d13a258d304924dbb4e680859d7c521ae68834c09bd01bb1e00b94850fa51bf6" gracePeriod=2 Dec 04 10:30:29 crc kubenswrapper[4831]: I1204 10:30:29.821046 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" containerID="d13a258d304924dbb4e680859d7c521ae68834c09bd01bb1e00b94850fa51bf6" exitCode=0 Dec 04 10:30:29 crc kubenswrapper[4831]: I1204 10:30:29.821253 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z728q" event={"ID":"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59","Type":"ContainerDied","Data":"d13a258d304924dbb4e680859d7c521ae68834c09bd01bb1e00b94850fa51bf6"} Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.565007 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.667560 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvkl7\" (UniqueName: \"kubernetes.io/projected/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-kube-api-access-lvkl7\") pod \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.667727 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-utilities\") pod \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.667783 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-catalog-content\") pod \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\" (UID: \"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59\") " Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.668813 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-utilities" (OuterVolumeSpecName: "utilities") pod 
"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" (UID: "ebe3cfaf-869d-49e5-9a25-46b7dfc86a59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.676942 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-kube-api-access-lvkl7" (OuterVolumeSpecName: "kube-api-access-lvkl7") pod "ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" (UID: "ebe3cfaf-869d-49e5-9a25-46b7dfc86a59"). InnerVolumeSpecName "kube-api-access-lvkl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.684498 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" (UID: "ebe3cfaf-869d-49e5-9a25-46b7dfc86a59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.768977 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.769006 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.769016 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvkl7\" (UniqueName: \"kubernetes.io/projected/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59-kube-api-access-lvkl7\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.827826 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z728q" event={"ID":"ebe3cfaf-869d-49e5-9a25-46b7dfc86a59","Type":"ContainerDied","Data":"39c387430e999e018c0e6f9833fbdff911320b406e7dcd26570efaa06e6acf15"} Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.827877 4831 scope.go:117] "RemoveContainer" containerID="d13a258d304924dbb4e680859d7c521ae68834c09bd01bb1e00b94850fa51bf6" Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.827892 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z728q" Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.854885 4831 scope.go:117] "RemoveContainer" containerID="74e8ecf524438f04ecac2401d068b5808d9ba5ab0ea54108a90c0aecd0c2222d" Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.856422 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z728q"] Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.864110 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z728q"] Dec 04 10:30:30 crc kubenswrapper[4831]: I1204 10:30:30.875643 4831 scope.go:117] "RemoveContainer" containerID="3bdcb1046e2f8741fe746c9eba2f31809a71d288c8c0805af85d4bd78405717c" Dec 04 10:30:31 crc kubenswrapper[4831]: I1204 10:30:31.283226 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" path="/var/lib/kubelet/pods/ebe3cfaf-869d-49e5-9a25-46b7dfc86a59/volumes" Dec 04 10:30:35 crc kubenswrapper[4831]: E1204 10:30:35.859421 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice/crio-9ce2821116e97934ca3e1fadd0575a8ee98bfa81d79d4644d265aba027247a6c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice\": RecentStats: unable to find data in memory cache]" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.200644 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2"] Dec 04 10:30:41 crc kubenswrapper[4831]: E1204 10:30:41.201492 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" 
containerName="registry-server" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.201510 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" containerName="registry-server" Dec 04 10:30:41 crc kubenswrapper[4831]: E1204 10:30:41.201520 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" containerName="extract-utilities" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.201526 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" containerName="extract-utilities" Dec 04 10:30:41 crc kubenswrapper[4831]: E1204 10:30:41.201540 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" containerName="extract-content" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.201547 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" containerName="extract-content" Dec 04 10:30:41 crc kubenswrapper[4831]: E1204 10:30:41.201555 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" containerName="extract-content" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.201561 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" containerName="extract-content" Dec 04 10:30:41 crc kubenswrapper[4831]: E1204 10:30:41.201571 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" containerName="extract-utilities" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.201576 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" containerName="extract-utilities" Dec 04 10:30:41 crc kubenswrapper[4831]: E1204 10:30:41.201590 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" 
containerName="registry-server" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.201596 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" containerName="registry-server" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.201757 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc98bb3-e6ba-4f98-b8ac-e4fa5b333b79" containerName="registry-server" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.201771 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe3cfaf-869d-49e5-9a25-46b7dfc86a59" containerName="registry-server" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.202576 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.207915 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.210092 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-z96zx" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.247222 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748967c98-qdltx"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.258380 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748967c98-qdltx" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.267180 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-bdclm" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.300827 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.302426 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.310750 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-mf8z8" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.314080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxrc2\" (UniqueName: \"kubernetes.io/projected/2fa5a388-dd09-43cb-90d4-01bc536e1e82-kube-api-access-kxrc2\") pod \"barbican-operator-controller-manager-5bfbbb859d-bcnp2\" (UID: \"2fa5a388-dd09-43cb-90d4-01bc536e1e82\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.314149 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmtjp\" (UniqueName: \"kubernetes.io/projected/f635920e-b830-40f5-afbb-d3a21ac15900-kube-api-access-kmtjp\") pod \"cinder-operator-controller-manager-748967c98-qdltx\" (UID: \"f635920e-b830-40f5-afbb-d3a21ac15900\") " pod="openstack-operators/cinder-operator-controller-manager-748967c98-qdltx" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.337324 4831 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.347571 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748967c98-qdltx"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.354951 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.356345 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.364008 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xd2kh" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.369186 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.382329 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.383411 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.385643 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-6wg6g" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.413512 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.414484 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.415818 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btl9s\" (UniqueName: \"kubernetes.io/projected/2f1667cb-e22c-443f-ab25-594205cd0f52-kube-api-access-btl9s\") pod \"glance-operator-controller-manager-85fbd69fcd-pkdwc\" (UID: \"2f1667cb-e22c-443f-ab25-594205cd0f52\") " pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.415899 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxrc2\" (UniqueName: \"kubernetes.io/projected/2fa5a388-dd09-43cb-90d4-01bc536e1e82-kube-api-access-kxrc2\") pod \"barbican-operator-controller-manager-5bfbbb859d-bcnp2\" (UID: \"2fa5a388-dd09-43cb-90d4-01bc536e1e82\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.415931 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmtjp\" (UniqueName: \"kubernetes.io/projected/f635920e-b830-40f5-afbb-d3a21ac15900-kube-api-access-kmtjp\") pod \"cinder-operator-controller-manager-748967c98-qdltx\" (UID: 
\"f635920e-b830-40f5-afbb-d3a21ac15900\") " pod="openstack-operators/cinder-operator-controller-manager-748967c98-qdltx" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.415966 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snz7h\" (UniqueName: \"kubernetes.io/projected/5006fa30-e563-468f-87c1-d062ca2aacc9-kube-api-access-snz7h\") pod \"designate-operator-controller-manager-6788cc6d75-szknd\" (UID: \"5006fa30-e563-468f-87c1-d062ca2aacc9\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.423025 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-j59gn" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.461891 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.462909 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmtjp\" (UniqueName: \"kubernetes.io/projected/f635920e-b830-40f5-afbb-d3a21ac15900-kube-api-access-kmtjp\") pod \"cinder-operator-controller-manager-748967c98-qdltx\" (UID: \"f635920e-b830-40f5-afbb-d3a21ac15900\") " pod="openstack-operators/cinder-operator-controller-manager-748967c98-qdltx" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.471210 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxrc2\" (UniqueName: \"kubernetes.io/projected/2fa5a388-dd09-43cb-90d4-01bc536e1e82-kube-api-access-kxrc2\") pod \"barbican-operator-controller-manager-5bfbbb859d-bcnp2\" (UID: \"2fa5a388-dd09-43cb-90d4-01bc536e1e82\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.479792 4831 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.486370 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.487535 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.493574 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-pnw7k" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.493805 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.495118 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.509644 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.510944 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.513261 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-j6b7q" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.517471 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbzvp\" (UniqueName: \"kubernetes.io/projected/50076597-4138-4f7c-ad31-378087dfc135-kube-api-access-vbzvp\") pod \"horizon-operator-controller-manager-7d5d9fd47f-78d6x\" (UID: \"50076597-4138-4f7c-ad31-378087dfc135\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.517529 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snz7h\" (UniqueName: \"kubernetes.io/projected/5006fa30-e563-468f-87c1-d062ca2aacc9-kube-api-access-snz7h\") pod \"designate-operator-controller-manager-6788cc6d75-szknd\" (UID: \"5006fa30-e563-468f-87c1-d062ca2aacc9\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.517572 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btl9s\" (UniqueName: \"kubernetes.io/projected/2f1667cb-e22c-443f-ab25-594205cd0f52-kube-api-access-btl9s\") pod \"glance-operator-controller-manager-85fbd69fcd-pkdwc\" (UID: \"2f1667cb-e22c-443f-ab25-594205cd0f52\") " pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.517610 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsc8f\" (UniqueName: 
\"kubernetes.io/projected/fa61aacd-6e6d-4d05-a918-d81916f3f187-kube-api-access-vsc8f\") pod \"heat-operator-controller-manager-698d6fd7d6-8ff6n\" (UID: \"fa61aacd-6e6d-4d05-a918-d81916f3f187\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.544984 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.551383 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.565046 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btl9s\" (UniqueName: \"kubernetes.io/projected/2f1667cb-e22c-443f-ab25-594205cd0f52-kube-api-access-btl9s\") pod \"glance-operator-controller-manager-85fbd69fcd-pkdwc\" (UID: \"2f1667cb-e22c-443f-ab25-594205cd0f52\") " pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.566747 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.568108 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.571766 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-v626h" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.572592 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snz7h\" (UniqueName: \"kubernetes.io/projected/5006fa30-e563-468f-87c1-d062ca2aacc9-kube-api-access-snz7h\") pod \"designate-operator-controller-manager-6788cc6d75-szknd\" (UID: \"5006fa30-e563-468f-87c1-d062ca2aacc9\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.585018 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748967c98-qdltx" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.619584 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsc8f\" (UniqueName: \"kubernetes.io/projected/fa61aacd-6e6d-4d05-a918-d81916f3f187-kube-api-access-vsc8f\") pod \"heat-operator-controller-manager-698d6fd7d6-8ff6n\" (UID: \"fa61aacd-6e6d-4d05-a918-d81916f3f187\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.619636 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxf8t\" (UniqueName: \"kubernetes.io/projected/16cb21e3-9b80-4fcf-9389-5dc1e2343fa4-kube-api-access-gxf8t\") pod \"infra-operator-controller-manager-6c55d8d69b-5945b\" (UID: \"16cb21e3-9b80-4fcf-9389-5dc1e2343fa4\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.623646 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhtn9\" (UniqueName: \"kubernetes.io/projected/72db73b6-9683-45a8-8b41-4ced2e11efdd-kube-api-access-bhtn9\") pod \"keystone-operator-controller-manager-79cc9d59f5-kwgb2\" (UID: \"72db73b6-9683-45a8-8b41-4ced2e11efdd\") " pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.623756 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbzvp\" (UniqueName: \"kubernetes.io/projected/50076597-4138-4f7c-ad31-378087dfc135-kube-api-access-vbzvp\") pod \"horizon-operator-controller-manager-7d5d9fd47f-78d6x\" (UID: \"50076597-4138-4f7c-ad31-378087dfc135\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.623794 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrmm\" (UniqueName: \"kubernetes.io/projected/38d9a0e7-49b9-4f91-938e-c040eef2bb37-kube-api-access-xbrmm\") pod \"ironic-operator-controller-manager-54485f899-nj5xp\" (UID: \"38d9a0e7-49b9-4f91-938e-c040eef2bb37\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.623935 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cb21e3-9b80-4fcf-9389-5dc1e2343fa4-cert\") pod \"infra-operator-controller-manager-6c55d8d69b-5945b\" (UID: \"16cb21e3-9b80-4fcf-9389-5dc1e2343fa4\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.626108 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.628057 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.641042 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.642349 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.644168 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-wlb7z" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.644816 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsc8f\" (UniqueName: \"kubernetes.io/projected/fa61aacd-6e6d-4d05-a918-d81916f3f187-kube-api-access-vsc8f\") pod \"heat-operator-controller-manager-698d6fd7d6-8ff6n\" (UID: \"fa61aacd-6e6d-4d05-a918-d81916f3f187\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.646337 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbzvp\" (UniqueName: \"kubernetes.io/projected/50076597-4138-4f7c-ad31-378087dfc135-kube-api-access-vbzvp\") pod \"horizon-operator-controller-manager-7d5d9fd47f-78d6x\" (UID: \"50076597-4138-4f7c-ad31-378087dfc135\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.650298 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.653445 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.656443 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-dsjbp" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.668199 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.697002 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.715082 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.718238 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.719240 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.726132 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-hjqq7" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.727486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk2n5\" (UniqueName: \"kubernetes.io/projected/09dfcf4a-2343-414a-a731-a64a914ab3db-kube-api-access-kk2n5\") pod \"manila-operator-controller-manager-5cbc8c7f96-l5fq4\" (UID: \"09dfcf4a-2343-414a-a731-a64a914ab3db\") " pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.727541 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cb21e3-9b80-4fcf-9389-5dc1e2343fa4-cert\") pod \"infra-operator-controller-manager-6c55d8d69b-5945b\" (UID: \"16cb21e3-9b80-4fcf-9389-5dc1e2343fa4\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.727597 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4znc\" (UniqueName: \"kubernetes.io/projected/10402b7d-27c5-4672-ba4e-2e67fcb5bf68-kube-api-access-g4znc\") pod \"mariadb-operator-controller-manager-64d7c556cd-8dcqz\" (UID: \"10402b7d-27c5-4672-ba4e-2e67fcb5bf68\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.727642 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxf8t\" (UniqueName: \"kubernetes.io/projected/16cb21e3-9b80-4fcf-9389-5dc1e2343fa4-kube-api-access-gxf8t\") pod 
\"infra-operator-controller-manager-6c55d8d69b-5945b\" (UID: \"16cb21e3-9b80-4fcf-9389-5dc1e2343fa4\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.727692 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhtn9\" (UniqueName: \"kubernetes.io/projected/72db73b6-9683-45a8-8b41-4ced2e11efdd-kube-api-access-bhtn9\") pod \"keystone-operator-controller-manager-79cc9d59f5-kwgb2\" (UID: \"72db73b6-9683-45a8-8b41-4ced2e11efdd\") " pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.727718 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrmm\" (UniqueName: \"kubernetes.io/projected/38d9a0e7-49b9-4f91-938e-c040eef2bb37-kube-api-access-xbrmm\") pod \"ironic-operator-controller-manager-54485f899-nj5xp\" (UID: \"38d9a0e7-49b9-4f91-938e-c040eef2bb37\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.729221 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.734490 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cb21e3-9b80-4fcf-9389-5dc1e2343fa4-cert\") pod \"infra-operator-controller-manager-6c55d8d69b-5945b\" (UID: \"16cb21e3-9b80-4fcf-9389-5dc1e2343fa4\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.747942 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.748752 4831 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.749723 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.754277 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-vs8g8" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.756120 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhtn9\" (UniqueName: \"kubernetes.io/projected/72db73b6-9683-45a8-8b41-4ced2e11efdd-kube-api-access-bhtn9\") pod \"keystone-operator-controller-manager-79cc9d59f5-kwgb2\" (UID: \"72db73b6-9683-45a8-8b41-4ced2e11efdd\") " pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.757005 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxf8t\" (UniqueName: \"kubernetes.io/projected/16cb21e3-9b80-4fcf-9389-5dc1e2343fa4-kube-api-access-gxf8t\") pod \"infra-operator-controller-manager-6c55d8d69b-5945b\" (UID: \"16cb21e3-9b80-4fcf-9389-5dc1e2343fa4\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.762488 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.769776 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.773472 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrmm\" 
(UniqueName: \"kubernetes.io/projected/38d9a0e7-49b9-4f91-938e-c040eef2bb37-kube-api-access-xbrmm\") pod \"ironic-operator-controller-manager-54485f899-nj5xp\" (UID: \"38d9a0e7-49b9-4f91-938e-c040eef2bb37\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.815019 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.821778 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.825194 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.829656 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.830985 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2pbp\" (UniqueName: \"kubernetes.io/projected/b4b2ad0a-7308-4184-9d37-0c6b60ff873c-kube-api-access-h2pbp\") pod \"neutron-operator-controller-manager-58879495c-8ffv5\" (UID: \"b4b2ad0a-7308-4184-9d37-0c6b60ff873c\") " pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.831057 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk2n5\" (UniqueName: \"kubernetes.io/projected/09dfcf4a-2343-414a-a731-a64a914ab3db-kube-api-access-kk2n5\") pod \"manila-operator-controller-manager-5cbc8c7f96-l5fq4\" (UID: \"09dfcf4a-2343-414a-a731-a64a914ab3db\") " 
pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.831150 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4znc\" (UniqueName: \"kubernetes.io/projected/10402b7d-27c5-4672-ba4e-2e67fcb5bf68-kube-api-access-g4znc\") pod \"mariadb-operator-controller-manager-64d7c556cd-8dcqz\" (UID: \"10402b7d-27c5-4672-ba4e-2e67fcb5bf68\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.831235 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg9xv\" (UniqueName: \"kubernetes.io/projected/2381f3a2-3514-4ff4-bfd7-47cb5587265c-kube-api-access-rg9xv\") pod \"nova-operator-controller-manager-79d658b66d-7djfw\" (UID: \"2381f3a2-3514-4ff4-bfd7-47cb5587265c\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.849001 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.852924 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-trcsd" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.893574 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4znc\" (UniqueName: \"kubernetes.io/projected/10402b7d-27c5-4672-ba4e-2e67fcb5bf68-kube-api-access-g4znc\") pod \"mariadb-operator-controller-manager-64d7c556cd-8dcqz\" (UID: \"10402b7d-27c5-4672-ba4e-2e67fcb5bf68\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.909149 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.912164 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.916685 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hzj8n" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.929729 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.931890 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk2n5\" (UniqueName: \"kubernetes.io/projected/09dfcf4a-2343-414a-a731-a64a914ab3db-kube-api-access-kk2n5\") pod \"manila-operator-controller-manager-5cbc8c7f96-l5fq4\" (UID: \"09dfcf4a-2343-414a-a731-a64a914ab3db\") " pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.932683 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg9xv\" (UniqueName: \"kubernetes.io/projected/2381f3a2-3514-4ff4-bfd7-47cb5587265c-kube-api-access-rg9xv\") pod \"nova-operator-controller-manager-79d658b66d-7djfw\" (UID: \"2381f3a2-3514-4ff4-bfd7-47cb5587265c\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.932733 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2pbp\" (UniqueName: \"kubernetes.io/projected/b4b2ad0a-7308-4184-9d37-0c6b60ff873c-kube-api-access-h2pbp\") pod \"neutron-operator-controller-manager-58879495c-8ffv5\" (UID: \"b4b2ad0a-7308-4184-9d37-0c6b60ff873c\") " pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.932816 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9l47\" (UniqueName: \"kubernetes.io/projected/9935df64-3d24-41fc-bc66-c2c576211287-kube-api-access-z9l47\") pod \"octavia-operator-controller-manager-d5fb87cb8-zcm4n\" (UID: \"9935df64-3d24-41fc-bc66-c2c576211287\") " pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.933454 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.940588 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.947016 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.947106 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-7596t" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.965638 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6"] Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.987256 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg9xv\" (UniqueName: \"kubernetes.io/projected/2381f3a2-3514-4ff4-bfd7-47cb5587265c-kube-api-access-rg9xv\") pod \"nova-operator-controller-manager-79d658b66d-7djfw\" (UID: \"2381f3a2-3514-4ff4-bfd7-47cb5587265c\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.989785 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h2pbp\" (UniqueName: \"kubernetes.io/projected/b4b2ad0a-7308-4184-9d37-0c6b60ff873c-kube-api-access-h2pbp\") pod \"neutron-operator-controller-manager-58879495c-8ffv5\" (UID: \"b4b2ad0a-7308-4184-9d37-0c6b60ff873c\") " pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.989815 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" Dec 04 10:30:41 crc kubenswrapper[4831]: I1204 10:30:41.990079 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.009831 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.010901 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.013384 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-2cl8r" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.028097 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.034225 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.035657 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.038569 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5p86c" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.039570 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c84j\" (UniqueName: \"kubernetes.io/projected/0c772a6a-6023-4eea-870a-904ae2d47896-kube-api-access-4c84j\") pod \"ovn-operator-controller-manager-5b67cfc8fb-jcvk6\" (UID: \"0c772a6a-6023-4eea-870a-904ae2d47896\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.039631 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cce552d-d699-47f9-9405-1dd33478f23d-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-2j4hr\" (UID: \"2cce552d-d699-47f9-9405-1dd33478f23d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.039700 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9l47\" (UniqueName: \"kubernetes.io/projected/9935df64-3d24-41fc-bc66-c2c576211287-kube-api-access-z9l47\") pod \"octavia-operator-controller-manager-d5fb87cb8-zcm4n\" (UID: \"9935df64-3d24-41fc-bc66-c2c576211287\") " pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.039724 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mn84\" (UniqueName: \"kubernetes.io/projected/2cce552d-d699-47f9-9405-1dd33478f23d-kube-api-access-8mn84\") pod 
\"openstack-baremetal-operator-controller-manager-77868f484-2j4hr\" (UID: \"2cce552d-d699-47f9-9405-1dd33478f23d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.049874 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.053129 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.065499 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9l47\" (UniqueName: \"kubernetes.io/projected/9935df64-3d24-41fc-bc66-c2c576211287-kube-api-access-z9l47\") pod \"octavia-operator-controller-manager-d5fb87cb8-zcm4n\" (UID: \"9935df64-3d24-41fc-bc66-c2c576211287\") " pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.068487 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.071638 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.088701 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.089944 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.093831 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vqkxw" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.105613 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.123325 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.125462 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.128397 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.134187 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-j7qqq" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.141324 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phzfd\" (UniqueName: \"kubernetes.io/projected/f2bea76c-1144-4d84-b1f3-e00fd84aa09d-kube-api-access-phzfd\") pod \"swift-operator-controller-manager-8f6687c44-6wj9b\" (UID: \"f2bea76c-1144-4d84-b1f3-e00fd84aa09d\") " pod="openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.141389 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mn84\" (UniqueName: 
\"kubernetes.io/projected/2cce552d-d699-47f9-9405-1dd33478f23d-kube-api-access-8mn84\") pod \"openstack-baremetal-operator-controller-manager-77868f484-2j4hr\" (UID: \"2cce552d-d699-47f9-9405-1dd33478f23d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.141434 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c84j\" (UniqueName: \"kubernetes.io/projected/0c772a6a-6023-4eea-870a-904ae2d47896-kube-api-access-4c84j\") pod \"ovn-operator-controller-manager-5b67cfc8fb-jcvk6\" (UID: \"0c772a6a-6023-4eea-870a-904ae2d47896\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.141521 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4cln\" (UniqueName: \"kubernetes.io/projected/b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970-kube-api-access-t4cln\") pod \"telemetry-operator-controller-manager-695797c565-7tbpv\" (UID: \"b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970\") " pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.141549 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k4nz\" (UniqueName: \"kubernetes.io/projected/de9a52da-79c5-43dd-8a20-eb9917314ad5-kube-api-access-5k4nz\") pod \"placement-operator-controller-manager-867d87977b-m5ds4\" (UID: \"de9a52da-79c5-43dd-8a20-eb9917314ad5\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.141593 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cce552d-d699-47f9-9405-1dd33478f23d-cert\") pod 
\"openstack-baremetal-operator-controller-manager-77868f484-2j4hr\" (UID: \"2cce552d-d699-47f9-9405-1dd33478f23d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" Dec 04 10:30:42 crc kubenswrapper[4831]: E1204 10:30:42.141759 4831 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:30:42 crc kubenswrapper[4831]: E1204 10:30:42.141936 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cce552d-d699-47f9-9405-1dd33478f23d-cert podName:2cce552d-d699-47f9-9405-1dd33478f23d nodeName:}" failed. No retries permitted until 2025-12-04 10:30:42.641801922 +0000 UTC m=+939.590977236 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2cce552d-d699-47f9-9405-1dd33478f23d-cert") pod "openstack-baremetal-operator-controller-manager-77868f484-2j4hr" (UID: "2cce552d-d699-47f9-9405-1dd33478f23d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.160209 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.161638 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.164952 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-z5wrw" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.168768 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mn84\" (UniqueName: \"kubernetes.io/projected/2cce552d-d699-47f9-9405-1dd33478f23d-kube-api-access-8mn84\") pod \"openstack-baremetal-operator-controller-manager-77868f484-2j4hr\" (UID: \"2cce552d-d699-47f9-9405-1dd33478f23d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.174496 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.178351 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.178370 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c84j\" (UniqueName: \"kubernetes.io/projected/0c772a6a-6023-4eea-870a-904ae2d47896-kube-api-access-4c84j\") pod \"ovn-operator-controller-manager-5b67cfc8fb-jcvk6\" (UID: \"0c772a6a-6023-4eea-870a-904ae2d47896\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.200737 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.204201 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.207981 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.208173 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vrr5q" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.211314 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.211935 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.244313 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr896\" (UniqueName: \"kubernetes.io/projected/8a70b911-37d6-41d5-81cf-9c631572a523-kube-api-access-lr896\") pod \"openstack-operator-controller-manager-76898dc959-mxx27\" (UID: \"8a70b911-37d6-41d5-81cf-9c631572a523\") " pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.244353 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phzfd\" (UniqueName: \"kubernetes.io/projected/f2bea76c-1144-4d84-b1f3-e00fd84aa09d-kube-api-access-phzfd\") pod \"swift-operator-controller-manager-8f6687c44-6wj9b\" (UID: \"f2bea76c-1144-4d84-b1f3-e00fd84aa09d\") " pod="openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.244388 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4wkg9\" (UniqueName: \"kubernetes.io/projected/578ef786-b985-42c9-ae2c-5d93b7359382-kube-api-access-4wkg9\") pod \"test-operator-controller-manager-bb86466d8-x2g4s\" (UID: \"578ef786-b985-42c9-ae2c-5d93b7359382\") " pod="openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.244438 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4cln\" (UniqueName: \"kubernetes.io/projected/b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970-kube-api-access-t4cln\") pod \"telemetry-operator-controller-manager-695797c565-7tbpv\" (UID: \"b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970\") " pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.244462 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k4nz\" (UniqueName: \"kubernetes.io/projected/de9a52da-79c5-43dd-8a20-eb9917314ad5-kube-api-access-5k4nz\") pod \"placement-operator-controller-manager-867d87977b-m5ds4\" (UID: \"de9a52da-79c5-43dd-8a20-eb9917314ad5\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.244485 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8v98\" (UniqueName: \"kubernetes.io/projected/1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6-kube-api-access-z8v98\") pod \"watcher-operator-controller-manager-7f99c4b8d7-bl79v\" (UID: \"1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6\") " pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.244502 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a70b911-37d6-41d5-81cf-9c631572a523-cert\") pod 
\"openstack-operator-controller-manager-76898dc959-mxx27\" (UID: \"8a70b911-37d6-41d5-81cf-9c631572a523\") " pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.261648 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.277721 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.277866 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phzfd\" (UniqueName: \"kubernetes.io/projected/f2bea76c-1144-4d84-b1f3-e00fd84aa09d-kube-api-access-phzfd\") pod \"swift-operator-controller-manager-8f6687c44-6wj9b\" (UID: \"f2bea76c-1144-4d84-b1f3-e00fd84aa09d\") " pod="openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.278211 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4cln\" (UniqueName: \"kubernetes.io/projected/b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970-kube-api-access-t4cln\") pod \"telemetry-operator-controller-manager-695797c565-7tbpv\" (UID: \"b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970\") " pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.278485 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.297588 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-2cbph" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.297750 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k4nz\" (UniqueName: \"kubernetes.io/projected/de9a52da-79c5-43dd-8a20-eb9917314ad5-kube-api-access-5k4nz\") pod \"placement-operator-controller-manager-867d87977b-m5ds4\" (UID: \"de9a52da-79c5-43dd-8a20-eb9917314ad5\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.298240 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.346539 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8v98\" (UniqueName: \"kubernetes.io/projected/1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6-kube-api-access-z8v98\") pod \"watcher-operator-controller-manager-7f99c4b8d7-bl79v\" (UID: \"1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6\") " pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.346588 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a70b911-37d6-41d5-81cf-9c631572a523-cert\") pod \"openstack-operator-controller-manager-76898dc959-mxx27\" (UID: \"8a70b911-37d6-41d5-81cf-9c631572a523\") " pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.346678 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm2w4\" (UniqueName: \"kubernetes.io/projected/63d30685-d6c8-44cf-a586-2bb8030844da-kube-api-access-qm2w4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc\" (UID: \"63d30685-d6c8-44cf-a586-2bb8030844da\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.346714 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr896\" (UniqueName: \"kubernetes.io/projected/8a70b911-37d6-41d5-81cf-9c631572a523-kube-api-access-lr896\") pod \"openstack-operator-controller-manager-76898dc959-mxx27\" (UID: \"8a70b911-37d6-41d5-81cf-9c631572a523\") " pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.346773 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wkg9\" (UniqueName: \"kubernetes.io/projected/578ef786-b985-42c9-ae2c-5d93b7359382-kube-api-access-4wkg9\") pod \"test-operator-controller-manager-bb86466d8-x2g4s\" (UID: \"578ef786-b985-42c9-ae2c-5d93b7359382\") " pod="openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.371735 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a70b911-37d6-41d5-81cf-9c631572a523-cert\") pod \"openstack-operator-controller-manager-76898dc959-mxx27\" (UID: \"8a70b911-37d6-41d5-81cf-9c631572a523\") " pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.380567 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr896\" (UniqueName: \"kubernetes.io/projected/8a70b911-37d6-41d5-81cf-9c631572a523-kube-api-access-lr896\") pod 
\"openstack-operator-controller-manager-76898dc959-mxx27\" (UID: \"8a70b911-37d6-41d5-81cf-9c631572a523\") " pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.381871 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8v98\" (UniqueName: \"kubernetes.io/projected/1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6-kube-api-access-z8v98\") pod \"watcher-operator-controller-manager-7f99c4b8d7-bl79v\" (UID: \"1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6\") " pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.382460 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wkg9\" (UniqueName: \"kubernetes.io/projected/578ef786-b985-42c9-ae2c-5d93b7359382-kube-api-access-4wkg9\") pod \"test-operator-controller-manager-bb86466d8-x2g4s\" (UID: \"578ef786-b985-42c9-ae2c-5d93b7359382\") " pod="openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.415446 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.448593 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm2w4\" (UniqueName: \"kubernetes.io/projected/63d30685-d6c8-44cf-a586-2bb8030844da-kube-api-access-qm2w4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc\" (UID: \"63d30685-d6c8-44cf-a586-2bb8030844da\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.469032 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm2w4\" (UniqueName: 
\"kubernetes.io/projected/63d30685-d6c8-44cf-a586-2bb8030844da-kube-api-access-qm2w4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc\" (UID: \"63d30685-d6c8-44cf-a586-2bb8030844da\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.474803 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.499027 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.514783 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.537467 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.563071 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.563640 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.571867 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748967c98-qdltx"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.582237 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.602997 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.630173 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.658626 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cce552d-d699-47f9-9405-1dd33478f23d-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-2j4hr\" (UID: \"2cce552d-d699-47f9-9405-1dd33478f23d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.663112 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cce552d-d699-47f9-9405-1dd33478f23d-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-2j4hr\" (UID: \"2cce552d-d699-47f9-9405-1dd33478f23d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.809962 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2"] 
Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.832140 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.840083 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n"] Dec 04 10:30:42 crc kubenswrapper[4831]: W1204 10:30:42.869985 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa61aacd_6e6d_4d05_a918_d81916f3f187.slice/crio-b09d7bbfe1e598d56ac9ea6e7858614c5b1d9e816b14290ee9c8fad83cc2e8b3 WatchSource:0}: Error finding container b09d7bbfe1e598d56ac9ea6e7858614c5b1d9e816b14290ee9c8fad83cc2e8b3: Status 404 returned error can't find the container with id b09d7bbfe1e598d56ac9ea6e7858614c5b1d9e816b14290ee9c8fad83cc2e8b3 Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.891850 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.974762 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp"] Dec 04 10:30:42 crc kubenswrapper[4831]: W1204 10:30:42.984631 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10402b7d_27c5_4672_ba4e_2e67fcb5bf68.slice/crio-53b1f870c20d8d3d4db311a3c0e5d194a62c6dde59d596e1e5e76433415eab10 WatchSource:0}: Error finding container 53b1f870c20d8d3d4db311a3c0e5d194a62c6dde59d596e1e5e76433415eab10: Status 404 returned error can't find the container with id 53b1f870c20d8d3d4db311a3c0e5d194a62c6dde59d596e1e5e76433415eab10 Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.987266 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz"] Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.998245 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" event={"ID":"2f1667cb-e22c-443f-ab25-594205cd0f52","Type":"ContainerStarted","Data":"1e9dd5ff3e854010b3fe7a19036fd2e30ff389c3c29a66c5d8ea6243c6fbf8b3"} Dec 04 10:30:42 crc kubenswrapper[4831]: I1204 10:30:42.999171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2" event={"ID":"72db73b6-9683-45a8-8b41-4ced2e11efdd","Type":"ContainerStarted","Data":"d80553f0f34cb6b88b8424ebdba8e609a4a43910a2c0b74c32b4744fa27fa01b"} Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.001548 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4"] Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 
10:30:43.006429 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n" event={"ID":"fa61aacd-6e6d-4d05-a918-d81916f3f187","Type":"ContainerStarted","Data":"b09d7bbfe1e598d56ac9ea6e7858614c5b1d9e816b14290ee9c8fad83cc2e8b3"} Dec 04 10:30:43 crc kubenswrapper[4831]: W1204 10:30:43.007166 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09dfcf4a_2343_414a_a731_a64a914ab3db.slice/crio-04df1eee1fa9e2ccb0f8b4978b870a0d73920674ed7496b5677ca89a90139046 WatchSource:0}: Error finding container 04df1eee1fa9e2ccb0f8b4978b870a0d73920674ed7496b5677ca89a90139046: Status 404 returned error can't find the container with id 04df1eee1fa9e2ccb0f8b4978b870a0d73920674ed7496b5677ca89a90139046 Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.011033 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x"] Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.044432 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd" event={"ID":"5006fa30-e563-468f-87c1-d062ca2aacc9","Type":"ContainerStarted","Data":"e6e4e16003713b3ae58d67518e4e9ccf4c8537af070c0971513daf19b5552fb3"} Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.049279 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748967c98-qdltx" event={"ID":"f635920e-b830-40f5-afbb-d3a21ac15900","Type":"ContainerStarted","Data":"ae14a4ded5811e40ab9fe4bff1682db9fe8913c437f4badb4838cb3241c6a672"} Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.051588 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" 
event={"ID":"16cb21e3-9b80-4fcf-9389-5dc1e2343fa4","Type":"ContainerStarted","Data":"860cf5a9815ba7f7500f03b1e084ee77775dc9d3bdd978b008cf4286c982c15b"} Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.071473 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp" event={"ID":"38d9a0e7-49b9-4f91-938e-c040eef2bb37","Type":"ContainerStarted","Data":"9274a8425ef357b2ac07adc32a5387416e2eeec08d6d5a9e47db9e7ab30b816d"} Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.083010 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2" event={"ID":"2fa5a388-dd09-43cb-90d4-01bc536e1e82","Type":"ContainerStarted","Data":"21809451c8a28180d00ba443a689aa1d4556e091550793aba6419862ef57cc82"} Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.170191 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6"] Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.196499 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4"] Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.227977 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw"] Dec 04 10:30:43 crc kubenswrapper[4831]: W1204 10:30:43.236092 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c772a6a_6023_4eea_870a_904ae2d47896.slice/crio-39313bef50e544a31ba730c693787592317b4eb8d0ac25f5feb162945e44a43f WatchSource:0}: Error finding container 39313bef50e544a31ba730c693787592317b4eb8d0ac25f5feb162945e44a43f: Status 404 returned error can't find the container with id 39313bef50e544a31ba730c693787592317b4eb8d0ac25f5feb162945e44a43f Dec 04 10:30:43 crc 
kubenswrapper[4831]: I1204 10:30:43.336355 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5"] Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.336395 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n"] Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.336405 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v"] Dec 04 10:30:43 crc kubenswrapper[4831]: E1204 10:30:43.338199 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:1463c43243c75f56609cbae6bee2f86d411107181775721cb097cbd22fcae1d1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h2pbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-58879495c-8ffv5_openstack-operators(b4b2ad0a-7308-4184-9d37-0c6b60ff873c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:30:43 crc kubenswrapper[4831]: E1204 10:30:43.338985 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:5245e851b4476baecd4173eca3e8669ac09ec69d36ad1ebc3a0f867713cbc14b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: 
{{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z9l47,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-d5fb87cb8-zcm4n_openstack-operators(9935df64-3d24-41fc-bc66-c2c576211287): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:30:43 crc kubenswrapper[4831]: E1204 10:30:43.339109 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:38.102.83.47:5001/openstack-k8s-operators/watcher-operator:5e4745ad47403efffe48968672fd7cd5232c27d7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z8v98,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7f99c4b8d7-bl79v_openstack-operators(1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.344940 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b"] Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.428715 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv"] Dec 04 10:30:43 crc kubenswrapper[4831]: W1204 10:30:43.457082 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3bc64ee_f0c4_4c2a_99b7_0c7bd8a5a970.slice/crio-3ffdc606c456c3560bf63041962c4b454f5b7b9150d6e08337cca7c8386d501e WatchSource:0}: Error finding container 3ffdc606c456c3560bf63041962c4b454f5b7b9150d6e08337cca7c8386d501e: Status 404 returned error can't find the container with id 3ffdc606c456c3560bf63041962c4b454f5b7b9150d6e08337cca7c8386d501e Dec 04 10:30:43 crc kubenswrapper[4831]: E1204 10:30:43.477795 4831 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:78d91c3cdd5eda41c2cd6d4a8491844e161dc33f6221be8cb822b2107d7ff46f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t4cln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-695797c565-7tbpv_openstack-operators(b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:30:43 crc kubenswrapper[4831]: E1204 10:30:43.564105 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} 
{} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qm2w4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc_openstack-operators(63d30685-d6c8-44cf-a586-2bb8030844da): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:30:43 crc kubenswrapper[4831]: E1204 10:30:43.566246 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc" podUID="63d30685-d6c8-44cf-a586-2bb8030844da" Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.610608 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc"] Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.646605 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27"] Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.651541 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s"] Dec 04 10:30:43 crc kubenswrapper[4831]: I1204 10:30:43.693317 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr"] Dec 04 10:30:43 crc kubenswrapper[4831]: E1204 10:30:43.713523 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_AP
I_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom
:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:c
urrent-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-
horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:curr
ent-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-wo
rker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/opensta
ck-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8mn84,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-77868f484-2j4hr_openstack-operators(2cce552d-d699-47f9-9405-1dd33478f23d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:30:43 crc kubenswrapper[4831]: E1204 10:30:43.829048 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" podUID="b4b2ad0a-7308-4184-9d37-0c6b60ff873c" Dec 04 10:30:43 crc kubenswrapper[4831]: E1204 10:30:43.921129 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" podUID="1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6" Dec 04 10:30:43 crc 
kubenswrapper[4831]: E1204 10:30:43.921536 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" podUID="9935df64-3d24-41fc-bc66-c2c576211287" Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.105190 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" event={"ID":"b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970","Type":"ContainerStarted","Data":"bf724485d05dd790991bce936f4bb65ae1973a81ddc59f6a5f6cfd21283477aa"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.105244 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" event={"ID":"b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970","Type":"ContainerStarted","Data":"3ffdc606c456c3560bf63041962c4b454f5b7b9150d6e08337cca7c8386d501e"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.132899 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4" event={"ID":"de9a52da-79c5-43dd-8a20-eb9917314ad5","Type":"ContainerStarted","Data":"e6a64c67fc33af56dc8e42ff7c017bd98f53de2137a552e22becd48b78c8d9e6"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.139029 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" event={"ID":"2cce552d-d699-47f9-9405-1dd33478f23d","Type":"ContainerStarted","Data":"8940c08f94c900e30defddedfbe7461fa10f3c93c6d84ae38805d4e1d3679898"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.144527 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" 
event={"ID":"b4b2ad0a-7308-4184-9d37-0c6b60ff873c","Type":"ContainerStarted","Data":"14f2e30c76c6bab19127cd669417556ea7e4e461e95fe24c460d562ffac3295b"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.144561 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" event={"ID":"b4b2ad0a-7308-4184-9d37-0c6b60ff873c","Type":"ContainerStarted","Data":"ce5656f3443be64577f43a7a6668676c312faa977c5bf907f5a836de33cd531d"} Dec 04 10:30:44 crc kubenswrapper[4831]: E1204 10:30:44.146043 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:1463c43243c75f56609cbae6bee2f86d411107181775721cb097cbd22fcae1d1\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" podUID="b4b2ad0a-7308-4184-9d37-0c6b60ff873c" Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.153778 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw" event={"ID":"2381f3a2-3514-4ff4-bfd7-47cb5587265c","Type":"ContainerStarted","Data":"601bddfb67a720dbc51547f6f985b8cb44c4e85dcb49634c1b831d6adedc7292"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.156317 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" event={"ID":"09dfcf4a-2343-414a-a731-a64a914ab3db","Type":"ContainerStarted","Data":"04df1eee1fa9e2ccb0f8b4978b870a0d73920674ed7496b5677ca89a90139046"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.162032 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" 
event={"ID":"9935df64-3d24-41fc-bc66-c2c576211287","Type":"ContainerStarted","Data":"0e2525fba3712b5559b4a1b7254317e0af7a20e5c4728c45f891dac0c94a6bb5"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.162076 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" event={"ID":"9935df64-3d24-41fc-bc66-c2c576211287","Type":"ContainerStarted","Data":"0dacbf72e3747859b24390eaca891c81c4596931af5885f1f6a8cc42785d2101"} Dec 04 10:30:44 crc kubenswrapper[4831]: E1204 10:30:44.165100 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:5245e851b4476baecd4173eca3e8669ac09ec69d36ad1ebc3a0f867713cbc14b\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" podUID="9935df64-3d24-41fc-bc66-c2c576211287" Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.165444 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b" event={"ID":"f2bea76c-1144-4d84-b1f3-e00fd84aa09d","Type":"ContainerStarted","Data":"7e631e9daae2e2b6480f179030a769ad8bee503b32cd7e436fa00553ea3253e5"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.168945 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6" event={"ID":"0c772a6a-6023-4eea-870a-904ae2d47896","Type":"ContainerStarted","Data":"39313bef50e544a31ba730c693787592317b4eb8d0ac25f5feb162945e44a43f"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.175037 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" 
event={"ID":"1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6","Type":"ContainerStarted","Data":"24adaf1e05e751a863c545f1cd67958a77101a9d35841e71aa9a33d960cac361"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.175081 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" event={"ID":"1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6","Type":"ContainerStarted","Data":"ff19dc4370c25cacc6d5390f1456d0ededea64e55bbb68f89a199affbfe88ccd"} Dec 04 10:30:44 crc kubenswrapper[4831]: E1204 10:30:44.177590 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.47:5001/openstack-k8s-operators/watcher-operator:5e4745ad47403efffe48968672fd7cd5232c27d7\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" podUID="1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6" Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.183577 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" event={"ID":"8a70b911-37d6-41d5-81cf-9c631572a523","Type":"ContainerStarted","Data":"47cb151862fddc5d42c54e5f22e15044f1d72e490a7ad65e34c6741da9b4084a"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.183631 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" event={"ID":"8a70b911-37d6-41d5-81cf-9c631572a523","Type":"ContainerStarted","Data":"f1cf9a6cbc75a758ee6974ed5ad4e5e2d418b13d0e91f7a2885b0249d3c861c5"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.197577 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz" 
event={"ID":"10402b7d-27c5-4672-ba4e-2e67fcb5bf68","Type":"ContainerStarted","Data":"53b1f870c20d8d3d4db311a3c0e5d194a62c6dde59d596e1e5e76433415eab10"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.201217 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x" event={"ID":"50076597-4138-4f7c-ad31-378087dfc135","Type":"ContainerStarted","Data":"e56650d3a31074894a2c0e37e2f9d38d2c4eb82813d77b25ee329cd8c6119581"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.215577 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc" event={"ID":"63d30685-d6c8-44cf-a586-2bb8030844da","Type":"ContainerStarted","Data":"2cafafbdc103b4be2c95a5cec21d046b971a872ca8951eebe676dfa292bc04c9"} Dec 04 10:30:44 crc kubenswrapper[4831]: I1204 10:30:44.217174 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s" event={"ID":"578ef786-b985-42c9-ae2c-5d93b7359382","Type":"ContainerStarted","Data":"1d3eb7447d69d3d0dd136a5561eec9a9fb8790f7aba83d4b9f0dea9821b52dd5"} Dec 04 10:30:44 crc kubenswrapper[4831]: E1204 10:30:44.217515 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc" podUID="63d30685-d6c8-44cf-a586-2bb8030844da" Dec 04 10:30:44 crc kubenswrapper[4831]: E1204 10:30:44.317284 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" 
podUID="b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970" Dec 04 10:30:44 crc kubenswrapper[4831]: E1204 10:30:44.360977 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" podUID="2cce552d-d699-47f9-9405-1dd33478f23d" Dec 04 10:30:45 crc kubenswrapper[4831]: I1204 10:30:45.231727 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" event={"ID":"8a70b911-37d6-41d5-81cf-9c631572a523","Type":"ContainerStarted","Data":"dbd706347b5031428733e167503844524bb571c234e8fc2de052567f9a4ee12f"} Dec 04 10:30:45 crc kubenswrapper[4831]: I1204 10:30:45.232117 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" Dec 04 10:30:45 crc kubenswrapper[4831]: I1204 10:30:45.237058 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" event={"ID":"2cce552d-d699-47f9-9405-1dd33478f23d","Type":"ContainerStarted","Data":"785a500a00e8e41b1f6d4cc74f3c0621bf18810b9c6c5f78e99099855c6b85a1"} Dec 04 10:30:45 crc kubenswrapper[4831]: E1204 10:30:45.240673 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc" podUID="63d30685-d6c8-44cf-a586-2bb8030844da" Dec 04 10:30:45 crc kubenswrapper[4831]: E1204 10:30:45.241059 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"38.102.83.47:5001/openstack-k8s-operators/watcher-operator:5e4745ad47403efffe48968672fd7cd5232c27d7\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" podUID="1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6" Dec 04 10:30:45 crc kubenswrapper[4831]: E1204 10:30:45.241097 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:5245e851b4476baecd4173eca3e8669ac09ec69d36ad1ebc3a0f867713cbc14b\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" podUID="9935df64-3d24-41fc-bc66-c2c576211287" Dec 04 10:30:45 crc kubenswrapper[4831]: E1204 10:30:45.241154 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:1463c43243c75f56609cbae6bee2f86d411107181775721cb097cbd22fcae1d1\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" podUID="b4b2ad0a-7308-4184-9d37-0c6b60ff873c" Dec 04 10:30:45 crc kubenswrapper[4831]: E1204 10:30:45.241400 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" podUID="2cce552d-d699-47f9-9405-1dd33478f23d" Dec 04 10:30:45 crc kubenswrapper[4831]: E1204 10:30:45.241544 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:78d91c3cdd5eda41c2cd6d4a8491844e161dc33f6221be8cb822b2107d7ff46f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" podUID="b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970" Dec 04 10:30:45 crc kubenswrapper[4831]: I1204 10:30:45.285863 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" podStartSLOduration=4.285843264 podStartE2EDuration="4.285843264s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:30:45.271060279 +0000 UTC m=+942.220235593" watchObservedRunningTime="2025-12-04 10:30:45.285843264 +0000 UTC m=+942.235018578" Dec 04 10:30:46 crc kubenswrapper[4831]: E1204 10:30:46.086819 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice/crio-9ce2821116e97934ca3e1fadd0575a8ee98bfa81d79d4644d265aba027247a6c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice\": RecentStats: unable to find data in memory cache]" Dec 04 10:30:46 crc kubenswrapper[4831]: E1204 10:30:46.250646 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" podUID="2cce552d-d699-47f9-9405-1dd33478f23d" Dec 04 10:30:52 crc kubenswrapper[4831]: I1204 10:30:52.571750 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-76898dc959-mxx27" Dec 04 10:30:56 crc kubenswrapper[4831]: E1204 10:30:56.321511 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb08d97_21c7_4452_b8a7_8a8776ee28dd.slice/crio-9ce2821116e97934ca3e1fadd0575a8ee98bfa81d79d4644d265aba027247a6c\": RecentStats: unable to find data in memory cache]" Dec 04 10:30:56 crc kubenswrapper[4831]: E1204 10:30:56.843911 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:57d9cb0034a7d5c7a39410fcb619ade2010e6855344dc3a0bc2bfd98cdf345d8" Dec 04 10:30:56 crc kubenswrapper[4831]: E1204 10:30:56.844722 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:57d9cb0034a7d5c7a39410fcb619ade2010e6855344dc3a0bc2bfd98cdf345d8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kk2n5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5cbc8c7f96-l5fq4_openstack-operators(09dfcf4a-2343-414a-a731-a64a914ab3db): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:30:57 crc kubenswrapper[4831]: E1204 10:30:57.298093 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/glance-operator@sha256:f4b6baa2b8a661351cfc24fff5aacee5aa4198106618700cfa47ec3a75f88b31" Dec 04 10:30:57 crc kubenswrapper[4831]: E1204 10:30:57.298307 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:f4b6baa2b8a661351cfc24fff5aacee5aa4198106618700cfa47ec3a75f88b31,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-btl9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-85fbd69fcd-pkdwc_openstack-operators(2f1667cb-e22c-443f-ab25-594205cd0f52): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:30:57 crc kubenswrapper[4831]: E1204 10:30:57.700727 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" podUID="09dfcf4a-2343-414a-a731-a64a914ab3db" Dec 04 10:30:57 crc kubenswrapper[4831]: E1204 10:30:57.863866 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" podUID="2f1667cb-e22c-443f-ab25-594205cd0f52" Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.384693 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd" 
event={"ID":"5006fa30-e563-468f-87c1-d062ca2aacc9","Type":"ContainerStarted","Data":"906ce5b0f3288675b7c8ec1d6147f606d75b0c7167e6caa023b4cd460ee8952d"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.399413 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b" event={"ID":"f2bea76c-1144-4d84-b1f3-e00fd84aa09d","Type":"ContainerStarted","Data":"366b825a000eab175067c76df8a1257f5f486b1e9d6e1ac95ee4c2bc67439c35"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.411911 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2" event={"ID":"72db73b6-9683-45a8-8b41-4ced2e11efdd","Type":"ContainerStarted","Data":"153938c8d53c9a0ee4e63aebf87504d7bd7ed76bdee2b1346015bac3cdc8e9c1"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.425037 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4" event={"ID":"de9a52da-79c5-43dd-8a20-eb9917314ad5","Type":"ContainerStarted","Data":"6e7b7c4c8f1f44c9d5ac2038162dc78421422bc10c4b2f4236ccf0877aaf906a"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.465971 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz" event={"ID":"10402b7d-27c5-4672-ba4e-2e67fcb5bf68","Type":"ContainerStarted","Data":"8866667573f2c69802cc3e7d30964cf0761c35a524af768a0bdeb77f85191124"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.519948 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n" event={"ID":"fa61aacd-6e6d-4d05-a918-d81916f3f187","Type":"ContainerStarted","Data":"5f55748b1358917645af1c7a661c8315afadb6f11ca329a556c8c58563ace744"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.542895 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw" event={"ID":"2381f3a2-3514-4ff4-bfd7-47cb5587265c","Type":"ContainerStarted","Data":"3724c163fa402f1bff71e9ecd29fa78fbbca48556d5ee1e32647e4cfd4580c4e"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.543522 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw" Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.562873 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp" event={"ID":"38d9a0e7-49b9-4f91-938e-c040eef2bb37","Type":"ContainerStarted","Data":"9ea4b44aff54e91cd286153cea7cdb8b86280593cd316a078e2c5a83debfe518"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.579649 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2" event={"ID":"2fa5a388-dd09-43cb-90d4-01bc536e1e82","Type":"ContainerStarted","Data":"04344909cfb7ae57b80803957ba53fd95c6f172b0830a5aba495af003da8360b"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.581817 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw" podStartSLOduration=4.065315867 podStartE2EDuration="17.581807882s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.302149252 +0000 UTC m=+940.251324566" lastFinishedPulling="2025-12-04 10:30:56.818641257 +0000 UTC m=+953.767816581" observedRunningTime="2025-12-04 10:30:58.57613326 +0000 UTC m=+955.525308574" watchObservedRunningTime="2025-12-04 10:30:58.581807882 +0000 UTC m=+955.530983196" Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.600553 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" 
event={"ID":"2f1667cb-e22c-443f-ab25-594205cd0f52","Type":"ContainerStarted","Data":"bbac37e31f7f257d606c58a00ade3434b8dfb5faaf54c88f0550896a6f935062"} Dec 04 10:30:58 crc kubenswrapper[4831]: E1204 10:30:58.603440 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:f4b6baa2b8a661351cfc24fff5aacee5aa4198106618700cfa47ec3a75f88b31\\\"\"" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" podUID="2f1667cb-e22c-443f-ab25-594205cd0f52" Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.615303 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" event={"ID":"09dfcf4a-2343-414a-a731-a64a914ab3db","Type":"ContainerStarted","Data":"01c4018ba710da46a793ff29d09e263072f11de4bc9daa811d4c63923ac68158"} Dec 04 10:30:58 crc kubenswrapper[4831]: E1204 10:30:58.620244 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:57d9cb0034a7d5c7a39410fcb619ade2010e6855344dc3a0bc2bfd98cdf345d8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" podUID="09dfcf4a-2343-414a-a731-a64a914ab3db" Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.643891 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s" event={"ID":"578ef786-b985-42c9-ae2c-5d93b7359382","Type":"ContainerStarted","Data":"f8a45cd0d1bbfd7bb61f2c0609e399e929811c34900818bcdc9cadcbf07f6bcd"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.674629 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" 
event={"ID":"16cb21e3-9b80-4fcf-9389-5dc1e2343fa4","Type":"ContainerStarted","Data":"cac9769f1eefbf12029cb0a058d76fd8e1176993240649377af5265e125c74ac"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.674693 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" event={"ID":"16cb21e3-9b80-4fcf-9389-5dc1e2343fa4","Type":"ContainerStarted","Data":"1396781d24c1ca566f24ab82ae686fa816f385f8c0ff4d2a40418d2d5c7d08c7"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.675336 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.701681 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x" event={"ID":"50076597-4138-4f7c-ad31-378087dfc135","Type":"ContainerStarted","Data":"f514cdc72a2ea9387c458eb97d7bb394f9e1f47bc7afd83e2ae152cee03f1c7f"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.713387 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" podStartSLOduration=3.761412162 podStartE2EDuration="17.713367164s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:42.867642321 +0000 UTC m=+939.816817635" lastFinishedPulling="2025-12-04 10:30:56.819597313 +0000 UTC m=+953.768772637" observedRunningTime="2025-12-04 10:30:58.711090173 +0000 UTC m=+955.660265487" watchObservedRunningTime="2025-12-04 10:30:58.713367164 +0000 UTC m=+955.662542478" Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.731648 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748967c98-qdltx" 
event={"ID":"f635920e-b830-40f5-afbb-d3a21ac15900","Type":"ContainerStarted","Data":"8bf4cf8554bcb1ae098148a4d1f37e106ec77a53223be5e8e1053f5c460f93e6"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.731701 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748967c98-qdltx" event={"ID":"f635920e-b830-40f5-afbb-d3a21ac15900","Type":"ContainerStarted","Data":"ed58f45635a6756328a2455a3fbb48628fc5a1d8531d80e308d139ccbbcca4a5"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.732246 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-748967c98-qdltx" Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.752645 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6" event={"ID":"0c772a6a-6023-4eea-870a-904ae2d47896","Type":"ContainerStarted","Data":"24f098c407c14cded072476390df7abfb245ea6d041bd9cb0ca88225ac54bb19"} Dec 04 10:30:58 crc kubenswrapper[4831]: I1204 10:30:58.755183 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-748967c98-qdltx" podStartSLOduration=3.584979531 podStartE2EDuration="17.75517361s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:42.649339792 +0000 UTC m=+939.598515106" lastFinishedPulling="2025-12-04 10:30:56.819533871 +0000 UTC m=+953.768709185" observedRunningTime="2025-12-04 10:30:58.751113932 +0000 UTC m=+955.700289246" watchObservedRunningTime="2025-12-04 10:30:58.75517361 +0000 UTC m=+955.704348924" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.762455 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s" 
event={"ID":"578ef786-b985-42c9-ae2c-5d93b7359382","Type":"ContainerStarted","Data":"b54a7702f1205e91c6d9d8a600cbb8cddacfb4b24c583f763703745d930cfda7"} Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.763503 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.765590 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp" event={"ID":"38d9a0e7-49b9-4f91-938e-c040eef2bb37","Type":"ContainerStarted","Data":"06476310c96da11074b4f3f9cc283b6fef156a44d8196c78ad65c23469e1e44a"} Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.766019 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.773445 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2" event={"ID":"2fa5a388-dd09-43cb-90d4-01bc536e1e82","Type":"ContainerStarted","Data":"f8a208b6c191fd435cd165216c9f10b2351ad5e83a9c64d20d15bfb3c7d73c79"} Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.773566 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.776925 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6" event={"ID":"0c772a6a-6023-4eea-870a-904ae2d47896","Type":"ContainerStarted","Data":"2559d6986db9c1397bb2f9435390f10844ca16eccff65db6f4fb3edd236fff53"} Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.777100 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.783369 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2" event={"ID":"72db73b6-9683-45a8-8b41-4ced2e11efdd","Type":"ContainerStarted","Data":"aba8729c4740c92c5bceeb43e7bd65b8aa0290267962002a8353cf0f8c5a45e9"} Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.783595 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.787777 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw" event={"ID":"2381f3a2-3514-4ff4-bfd7-47cb5587265c","Type":"ContainerStarted","Data":"51a8e5866b999453e2505569bfec26828b9faa1ce82e6e4629b5636182d12f1e"} Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.791996 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s" podStartSLOduration=5.075372423 podStartE2EDuration="18.791968932s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.637837714 +0000 UTC m=+940.587013028" lastFinishedPulling="2025-12-04 10:30:57.354434203 +0000 UTC m=+954.303609537" observedRunningTime="2025-12-04 10:30:59.779804597 +0000 UTC m=+956.728979981" watchObservedRunningTime="2025-12-04 10:30:59.791968932 +0000 UTC m=+956.741144246" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.792602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b" event={"ID":"f2bea76c-1144-4d84-b1f3-e00fd84aa09d","Type":"ContainerStarted","Data":"d9a31eb36a9a7c830182eb2e683c9ea5505b04323a8f319b98add127e04f1a8a"} Dec 04 10:30:59 
crc kubenswrapper[4831]: I1204 10:30:59.792884 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.815937 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x" event={"ID":"50076597-4138-4f7c-ad31-378087dfc135","Type":"ContainerStarted","Data":"dc22763ca4dd7ae060bc6a750b88737db718721f2dc3d5330c330634c70d33cd"} Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.816543 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.819348 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4" event={"ID":"de9a52da-79c5-43dd-8a20-eb9917314ad5","Type":"ContainerStarted","Data":"157f443f8f9043766c21032b496d3e5981f887303821e4b892212382fe4d6f4c"} Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.819926 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.822208 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz" event={"ID":"10402b7d-27c5-4672-ba4e-2e67fcb5bf68","Type":"ContainerStarted","Data":"83c7609b87a221219ef027f57e0c66a686563d8b80960fea2b577832f9d8ff86"} Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.822755 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.824345 4831 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp" podStartSLOduration=4.987603061 podStartE2EDuration="18.824321276s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:42.982840597 +0000 UTC m=+939.932015901" lastFinishedPulling="2025-12-04 10:30:56.819558802 +0000 UTC m=+953.768734116" observedRunningTime="2025-12-04 10:30:59.796067331 +0000 UTC m=+956.745242645" watchObservedRunningTime="2025-12-04 10:30:59.824321276 +0000 UTC m=+956.773496600" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.828063 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n" event={"ID":"fa61aacd-6e6d-4d05-a918-d81916f3f187","Type":"ContainerStarted","Data":"28e218c9783ba9b7b0cdeaa5f444faa1672513b08c4465afd80462e2547cd6f2"} Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.828592 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.839802 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6" podStartSLOduration=4.824347101 podStartE2EDuration="18.839776828s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.245550441 +0000 UTC m=+940.194725755" lastFinishedPulling="2025-12-04 10:30:57.260980168 +0000 UTC m=+954.210155482" observedRunningTime="2025-12-04 10:30:59.837414575 +0000 UTC m=+956.786589889" watchObservedRunningTime="2025-12-04 10:30:59.839776828 +0000 UTC m=+956.788952142" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.845517 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd" 
event={"ID":"5006fa30-e563-468f-87c1-d062ca2aacc9","Type":"ContainerStarted","Data":"b445144c3c9d5a6911ea63a0b29da1187a41e613ea30a41059a9acfb604dff91"} Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.845560 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd" Dec 04 10:30:59 crc kubenswrapper[4831]: E1204 10:30:59.847290 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:57d9cb0034a7d5c7a39410fcb619ade2010e6855344dc3a0bc2bfd98cdf345d8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" podUID="09dfcf4a-2343-414a-a731-a64a914ab3db" Dec 04 10:30:59 crc kubenswrapper[4831]: E1204 10:30:59.849087 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:f4b6baa2b8a661351cfc24fff5aacee5aa4198106618700cfa47ec3a75f88b31\\\"\"" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" podUID="2f1667cb-e22c-443f-ab25-594205cd0f52" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.882169 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2" podStartSLOduration=4.239027244 podStartE2EDuration="18.882136349s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:42.175470921 +0000 UTC m=+939.124646235" lastFinishedPulling="2025-12-04 10:30:56.818580036 +0000 UTC m=+953.767755340" observedRunningTime="2025-12-04 10:30:59.85858563 +0000 UTC m=+956.807760944" watchObservedRunningTime="2025-12-04 10:30:59.882136349 +0000 UTC m=+956.831311663" Dec 04 10:30:59 crc 
kubenswrapper[4831]: I1204 10:30:59.903352 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4" podStartSLOduration=6.337386369 podStartE2EDuration="18.903329085s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.245173811 +0000 UTC m=+940.194349125" lastFinishedPulling="2025-12-04 10:30:55.811116517 +0000 UTC m=+952.760291841" observedRunningTime="2025-12-04 10:30:59.894234352 +0000 UTC m=+956.843409666" watchObservedRunningTime="2025-12-04 10:30:59.903329085 +0000 UTC m=+956.852504399" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.938489 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd" podStartSLOduration=4.696449787 podStartE2EDuration="18.938469903s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:42.57657472 +0000 UTC m=+939.525750034" lastFinishedPulling="2025-12-04 10:30:56.818594836 +0000 UTC m=+953.767770150" observedRunningTime="2025-12-04 10:30:59.932085873 +0000 UTC m=+956.881261207" watchObservedRunningTime="2025-12-04 10:30:59.938469903 +0000 UTC m=+956.887645217" Dec 04 10:30:59 crc kubenswrapper[4831]: I1204 10:30:59.958514 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n" podStartSLOduration=5.019051041 podStartE2EDuration="18.958486798s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:42.879308853 +0000 UTC m=+939.828484167" lastFinishedPulling="2025-12-04 10:30:56.81874457 +0000 UTC m=+953.767919924" observedRunningTime="2025-12-04 10:30:59.949357464 +0000 UTC m=+956.898532778" watchObservedRunningTime="2025-12-04 10:30:59.958486798 +0000 UTC m=+956.907662112" Dec 04 10:31:00 crc kubenswrapper[4831]: I1204 
10:31:00.000786 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x" podStartSLOduration=5.251498296 podStartE2EDuration="19.000751406s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.070888458 +0000 UTC m=+940.020063772" lastFinishedPulling="2025-12-04 10:30:56.820141568 +0000 UTC m=+953.769316882" observedRunningTime="2025-12-04 10:30:59.994807138 +0000 UTC m=+956.943982462" watchObservedRunningTime="2025-12-04 10:31:00.000751406 +0000 UTC m=+956.949926730" Dec 04 10:31:00 crc kubenswrapper[4831]: I1204 10:31:00.001177 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2" podStartSLOduration=5.040887242 podStartE2EDuration="19.001171787s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:42.860648514 +0000 UTC m=+939.809823828" lastFinishedPulling="2025-12-04 10:30:56.820933059 +0000 UTC m=+953.770108373" observedRunningTime="2025-12-04 10:30:59.974585628 +0000 UTC m=+956.923760942" watchObservedRunningTime="2025-12-04 10:31:00.001171787 +0000 UTC m=+956.950347101" Dec 04 10:31:00 crc kubenswrapper[4831]: I1204 10:31:00.019318 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz" podStartSLOduration=4.777993754 podStartE2EDuration="19.019291531s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.026673587 +0000 UTC m=+939.975848891" lastFinishedPulling="2025-12-04 10:30:57.267971354 +0000 UTC m=+954.217146668" observedRunningTime="2025-12-04 10:31:00.015814868 +0000 UTC m=+956.964990192" watchObservedRunningTime="2025-12-04 10:31:00.019291531 +0000 UTC m=+956.968466845" Dec 04 10:31:00 crc kubenswrapper[4831]: I1204 10:31:00.038419 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b" podStartSLOduration=5.605410255 podStartE2EDuration="19.038395781s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.386567286 +0000 UTC m=+940.335742600" lastFinishedPulling="2025-12-04 10:30:56.819552812 +0000 UTC m=+953.768728126" observedRunningTime="2025-12-04 10:31:00.031308062 +0000 UTC m=+956.980483396" watchObservedRunningTime="2025-12-04 10:31:00.038395781 +0000 UTC m=+956.987571095" Dec 04 10:31:02 crc kubenswrapper[4831]: I1204 10:31:02.073290 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-7djfw" Dec 04 10:31:02 crc kubenswrapper[4831]: I1204 10:31:02.481050 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-867d87977b-m5ds4" Dec 04 10:31:02 crc kubenswrapper[4831]: I1204 10:31:02.512308 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-6wj9b" Dec 04 10:31:02 crc kubenswrapper[4831]: I1204 10:31:02.552653 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x2g4s" Dec 04 10:31:11 crc kubenswrapper[4831]: I1204 10:31:11.547765 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-bcnp2" Dec 04 10:31:11 crc kubenswrapper[4831]: I1204 10:31:11.625056 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-748967c98-qdltx" Dec 04 10:31:11 crc kubenswrapper[4831]: I1204 10:31:11.628909 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-szknd" Dec 04 10:31:11 crc kubenswrapper[4831]: I1204 10:31:11.719378 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-8ff6n" Dec 04 10:31:11 crc kubenswrapper[4831]: I1204 10:31:11.750776 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-78d6x" Dec 04 10:31:11 crc kubenswrapper[4831]: I1204 10:31:11.836840 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-5945b" Dec 04 10:31:11 crc kubenswrapper[4831]: I1204 10:31:11.853560 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-54485f899-nj5xp" Dec 04 10:31:11 crc kubenswrapper[4831]: I1204 10:31:11.936371 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-kwgb2" Dec 04 10:31:12 crc kubenswrapper[4831]: I1204 10:31:12.000263 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-8dcqz" Dec 04 10:31:12 crc kubenswrapper[4831]: I1204 10:31:12.264447 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-jcvk6" Dec 04 10:31:20 crc kubenswrapper[4831]: E1204 10:31:20.461692 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/openstack-k8s-operators/watcher-operator:5e4745ad47403efffe48968672fd7cd5232c27d7" Dec 04 10:31:20 crc kubenswrapper[4831]: E1204 10:31:20.462229 4831 kuberuntime_image.go:55] "Failed to 
pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/openstack-k8s-operators/watcher-operator:5e4745ad47403efffe48968672fd7cd5232c27d7" Dec 04 10:31:20 crc kubenswrapper[4831]: E1204 10:31:20.462361 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.47:5001/openstack-k8s-operators/watcher-operator:5e4745ad47403efffe48968672fd7cd5232c27d7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z8v98,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7f99c4b8d7-bl79v_openstack-operators(1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:31:20 crc kubenswrapper[4831]: E1204 10:31:20.463725 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" podUID="1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6" Dec 04 10:31:21 crc kubenswrapper[4831]: E1204 10:31:21.064568 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b" Dec 04 10:31:21 crc kubenswrapper[4831]: E1204 10:31:21.065000 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-w
orker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-ante
lope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_U
RL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-
antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ope
nstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:qua
y.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAU
LT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8mn84,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-77868f484-2j4hr_openstack-operators(2cce552d-d699-47f9-9405-1dd33478f23d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:31:21 crc kubenswrapper[4831]: E1204 10:31:21.066276 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" podUID="2cce552d-d699-47f9-9405-1dd33478f23d" Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.044470 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" event={"ID":"9935df64-3d24-41fc-bc66-c2c576211287","Type":"ContainerStarted","Data":"58d46b368e3ea99b4926d4c2a6958e612c7c222c693c939b1ecf257cd96ed5c1"} Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.044933 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" Dec 
04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.047420 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" event={"ID":"2f1667cb-e22c-443f-ab25-594205cd0f52","Type":"ContainerStarted","Data":"52ecc10f877b1612fef1e8e1d5758ac7e649a25b82d2b88a1e459a04db60ba60"} Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.047899 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.050182 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" event={"ID":"b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970","Type":"ContainerStarted","Data":"ab6bfa0bff3a3d4d8061da9765c6193ad39ca2207abd80ece50d92871e22f416"} Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.050375 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.052096 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc" event={"ID":"63d30685-d6c8-44cf-a586-2bb8030844da","Type":"ContainerStarted","Data":"ac40357a4749eedbbed66edfc81bf31e3c9812c81758849dc10d989dfeb6a694"} Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.054189 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" event={"ID":"b4b2ad0a-7308-4184-9d37-0c6b60ff873c","Type":"ContainerStarted","Data":"b4d06bc4bcd48c4c8964f77969dc634e6a04761516b3f70af9a95bf38060d2fa"} Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.054547 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.057120 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" event={"ID":"09dfcf4a-2343-414a-a731-a64a914ab3db","Type":"ContainerStarted","Data":"3b32eb1635194340fa760c9e1d4bcda6a1cb74c0609a5b4c7e1d4c7e269accb4"} Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.057243 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.068085 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" podStartSLOduration=3.322700295 podStartE2EDuration="41.06806697s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.338874612 +0000 UTC m=+940.288049926" lastFinishedPulling="2025-12-04 10:31:21.084241287 +0000 UTC m=+978.033416601" observedRunningTime="2025-12-04 10:31:22.062952655 +0000 UTC m=+979.012127979" watchObservedRunningTime="2025-12-04 10:31:22.06806697 +0000 UTC m=+979.017242284" Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.088491 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" podStartSLOduration=3.9929384900000002 podStartE2EDuration="41.088464518s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.338064491 +0000 UTC m=+940.287239805" lastFinishedPulling="2025-12-04 10:31:20.433590519 +0000 UTC m=+977.382765833" observedRunningTime="2025-12-04 10:31:22.080494858 +0000 UTC m=+979.029670222" watchObservedRunningTime="2025-12-04 10:31:22.088464518 +0000 UTC m=+979.037639872" Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 
10:31:22.120229 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" podStartSLOduration=3.449678949 podStartE2EDuration="41.120204355s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.477572185 +0000 UTC m=+940.426747499" lastFinishedPulling="2025-12-04 10:31:21.148097551 +0000 UTC m=+978.097272905" observedRunningTime="2025-12-04 10:31:22.106421712 +0000 UTC m=+979.055597026" watchObservedRunningTime="2025-12-04 10:31:22.120204355 +0000 UTC m=+979.069379709" Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.124147 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" podStartSLOduration=2.616740853 podStartE2EDuration="41.124123199s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:42.647979336 +0000 UTC m=+939.597154650" lastFinishedPulling="2025-12-04 10:31:21.155361692 +0000 UTC m=+978.104536996" observedRunningTime="2025-12-04 10:31:22.123394089 +0000 UTC m=+979.072569453" watchObservedRunningTime="2025-12-04 10:31:22.124123199 +0000 UTC m=+979.073298523" Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.140342 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc" podStartSLOduration=2.555222301 podStartE2EDuration="40.140322956s" podCreationTimestamp="2025-12-04 10:30:42 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.56390844 +0000 UTC m=+940.513083794" lastFinishedPulling="2025-12-04 10:31:21.149009105 +0000 UTC m=+978.098184449" observedRunningTime="2025-12-04 10:31:22.136936486 +0000 UTC m=+979.086111800" watchObservedRunningTime="2025-12-04 10:31:22.140322956 +0000 UTC m=+979.089498270" Dec 04 10:31:22 crc kubenswrapper[4831]: I1204 10:31:22.162968 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" podStartSLOduration=3.053173563 podStartE2EDuration="41.162950753s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.047826602 +0000 UTC m=+939.997001916" lastFinishedPulling="2025-12-04 10:31:21.157603742 +0000 UTC m=+978.106779106" observedRunningTime="2025-12-04 10:31:22.159279886 +0000 UTC m=+979.108455200" watchObservedRunningTime="2025-12-04 10:31:22.162950753 +0000 UTC m=+979.112126067" Dec 04 10:31:31 crc kubenswrapper[4831]: I1204 10:31:31.701795 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-pkdwc" Dec 04 10:31:31 crc kubenswrapper[4831]: I1204 10:31:31.995383 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-l5fq4" Dec 04 10:31:32 crc kubenswrapper[4831]: I1204 10:31:32.055438 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-58879495c-8ffv5" Dec 04 10:31:32 crc kubenswrapper[4831]: I1204 10:31:32.216025 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-zcm4n" Dec 04 10:31:32 crc kubenswrapper[4831]: E1204 10:31:32.278830 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.47:5001/openstack-k8s-operators/watcher-operator:5e4745ad47403efffe48968672fd7cd5232c27d7\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" podUID="1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6" Dec 04 10:31:32 crc kubenswrapper[4831]: I1204 10:31:32.519762 4831 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-7tbpv" Dec 04 10:31:35 crc kubenswrapper[4831]: E1204 10:31:35.281322 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" podUID="2cce552d-d699-47f9-9405-1dd33478f23d" Dec 04 10:31:47 crc kubenswrapper[4831]: I1204 10:31:47.292360 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" event={"ID":"1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6","Type":"ContainerStarted","Data":"f9e7367c9d6c29867b83c256c4d516b014cb279a86073596be5306573e8592f8"} Dec 04 10:31:47 crc kubenswrapper[4831]: I1204 10:31:47.293179 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" Dec 04 10:31:47 crc kubenswrapper[4831]: I1204 10:31:47.332011 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" podStartSLOduration=3.60042456 podStartE2EDuration="1m6.331987563s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.339059427 +0000 UTC m=+940.288234741" lastFinishedPulling="2025-12-04 10:31:46.07062241 +0000 UTC m=+1003.019797744" observedRunningTime="2025-12-04 10:31:47.323455678 +0000 UTC m=+1004.272631002" watchObservedRunningTime="2025-12-04 10:31:47.331987563 +0000 UTC m=+1004.281162877" Dec 04 10:31:48 crc kubenswrapper[4831]: I1204 10:31:48.310745 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" event={"ID":"2cce552d-d699-47f9-9405-1dd33478f23d","Type":"ContainerStarted","Data":"070afcf47220bb1d7d250f1e4b3c7860ef3e7e815abecf2eed20626012faff1f"} Dec 04 10:31:48 crc kubenswrapper[4831]: I1204 10:31:48.311604 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" Dec 04 10:31:48 crc kubenswrapper[4831]: I1204 10:31:48.360249 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" podStartSLOduration=3.322991045 podStartE2EDuration="1m7.360228107s" podCreationTimestamp="2025-12-04 10:30:41 +0000 UTC" firstStartedPulling="2025-12-04 10:30:43.711520031 +0000 UTC m=+940.660695345" lastFinishedPulling="2025-12-04 10:31:47.748757083 +0000 UTC m=+1004.697932407" observedRunningTime="2025-12-04 10:31:48.352823061 +0000 UTC m=+1005.301998405" watchObservedRunningTime="2025-12-04 10:31:48.360228107 +0000 UTC m=+1005.309403441" Dec 04 10:31:51 crc kubenswrapper[4831]: I1204 10:31:51.971923 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:31:51 crc kubenswrapper[4831]: I1204 10:31:51.973793 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:31:52 crc kubenswrapper[4831]: I1204 10:31:52.568855 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7f99c4b8d7-bl79v" Dec 04 10:31:52 crc kubenswrapper[4831]: I1204 10:31:52.903601 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-2j4hr" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.250452 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f59f64fdc-hl8mg"] Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.252322 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.254342 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.254511 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ttk2z" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.256020 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.258904 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f59f64fdc-hl8mg"] Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.259646 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.288164 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d46hz\" (UniqueName: \"kubernetes.io/projected/6e5d4c67-551c-49d3-b204-0562a898def5-kube-api-access-d46hz\") pod \"dnsmasq-dns-6f59f64fdc-hl8mg\" (UID: \"6e5d4c67-551c-49d3-b204-0562a898def5\") " pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.288294 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e5d4c67-551c-49d3-b204-0562a898def5-config\") pod \"dnsmasq-dns-6f59f64fdc-hl8mg\" (UID: \"6e5d4c67-551c-49d3-b204-0562a898def5\") " pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.328295 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-659447cd97-nphz2"] Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.334806 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659447cd97-nphz2"] Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.334900 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.337953 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.389511 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-dns-svc\") pod \"dnsmasq-dns-659447cd97-nphz2\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.389564 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e5d4c67-551c-49d3-b204-0562a898def5-config\") pod \"dnsmasq-dns-6f59f64fdc-hl8mg\" (UID: \"6e5d4c67-551c-49d3-b204-0562a898def5\") " pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.389594 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-config\") pod \"dnsmasq-dns-659447cd97-nphz2\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.389634 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d46hz\" (UniqueName: \"kubernetes.io/projected/6e5d4c67-551c-49d3-b204-0562a898def5-kube-api-access-d46hz\") pod \"dnsmasq-dns-6f59f64fdc-hl8mg\" (UID: \"6e5d4c67-551c-49d3-b204-0562a898def5\") " pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.389696 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j46f2\" (UniqueName: \"kubernetes.io/projected/1c23142e-4541-40cb-b870-aeb15f8af94a-kube-api-access-j46f2\") pod \"dnsmasq-dns-659447cd97-nphz2\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.390528 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e5d4c67-551c-49d3-b204-0562a898def5-config\") pod \"dnsmasq-dns-6f59f64fdc-hl8mg\" (UID: \"6e5d4c67-551c-49d3-b204-0562a898def5\") " pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.411578 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d46hz\" (UniqueName: \"kubernetes.io/projected/6e5d4c67-551c-49d3-b204-0562a898def5-kube-api-access-d46hz\") pod \"dnsmasq-dns-6f59f64fdc-hl8mg\" (UID: \"6e5d4c67-551c-49d3-b204-0562a898def5\") " pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.491055 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-config\") pod \"dnsmasq-dns-659447cd97-nphz2\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.491348 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j46f2\" (UniqueName: \"kubernetes.io/projected/1c23142e-4541-40cb-b870-aeb15f8af94a-kube-api-access-j46f2\") pod \"dnsmasq-dns-659447cd97-nphz2\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.491382 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-dns-svc\") pod \"dnsmasq-dns-659447cd97-nphz2\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.492008 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-dns-svc\") pod \"dnsmasq-dns-659447cd97-nphz2\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.492124 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-config\") pod \"dnsmasq-dns-659447cd97-nphz2\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.518945 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j46f2\" (UniqueName: \"kubernetes.io/projected/1c23142e-4541-40cb-b870-aeb15f8af94a-kube-api-access-j46f2\") pod 
\"dnsmasq-dns-659447cd97-nphz2\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.571757 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" Dec 04 10:32:09 crc kubenswrapper[4831]: I1204 10:32:09.667004 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:10 crc kubenswrapper[4831]: I1204 10:32:10.054130 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f59f64fdc-hl8mg"] Dec 04 10:32:10 crc kubenswrapper[4831]: I1204 10:32:10.175380 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659447cd97-nphz2"] Dec 04 10:32:10 crc kubenswrapper[4831]: W1204 10:32:10.178583 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c23142e_4541_40cb_b870_aeb15f8af94a.slice/crio-b73117ebb1926e8dcc06240025af80dcda76757aa84d03c8f4b169d546f09c7c WatchSource:0}: Error finding container b73117ebb1926e8dcc06240025af80dcda76757aa84d03c8f4b169d546f09c7c: Status 404 returned error can't find the container with id b73117ebb1926e8dcc06240025af80dcda76757aa84d03c8f4b169d546f09c7c Dec 04 10:32:10 crc kubenswrapper[4831]: I1204 10:32:10.511671 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" event={"ID":"6e5d4c67-551c-49d3-b204-0562a898def5","Type":"ContainerStarted","Data":"879fe661b14f8e581859602c747dabc5fa6ba62d89f2433f4db94352b98b8138"} Dec 04 10:32:10 crc kubenswrapper[4831]: I1204 10:32:10.516196 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659447cd97-nphz2" event={"ID":"1c23142e-4541-40cb-b870-aeb15f8af94a","Type":"ContainerStarted","Data":"b73117ebb1926e8dcc06240025af80dcda76757aa84d03c8f4b169d546f09c7c"} Dec 
04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.318209 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659447cd97-nphz2"] Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.345411 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7964976f-vpdr9"] Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.347174 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.379381 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4slkr\" (UniqueName: \"kubernetes.io/projected/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-kube-api-access-4slkr\") pod \"dnsmasq-dns-5d7964976f-vpdr9\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.379930 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-config\") pod \"dnsmasq-dns-5d7964976f-vpdr9\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.380277 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-dns-svc\") pod \"dnsmasq-dns-5d7964976f-vpdr9\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.396467 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7964976f-vpdr9"] Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.482684 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4slkr\" (UniqueName: \"kubernetes.io/projected/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-kube-api-access-4slkr\") pod \"dnsmasq-dns-5d7964976f-vpdr9\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.482750 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-config\") pod \"dnsmasq-dns-5d7964976f-vpdr9\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.482817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-dns-svc\") pod \"dnsmasq-dns-5d7964976f-vpdr9\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.484288 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-dns-svc\") pod \"dnsmasq-dns-5d7964976f-vpdr9\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.484390 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-config\") pod \"dnsmasq-dns-5d7964976f-vpdr9\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.502867 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4slkr\" (UniqueName: 
\"kubernetes.io/projected/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-kube-api-access-4slkr\") pod \"dnsmasq-dns-5d7964976f-vpdr9\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.637628 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f59f64fdc-hl8mg"] Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.692835 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.701720 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c99ddcf47-zm45x"] Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.702981 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.711490 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c99ddcf47-zm45x"] Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.787445 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-dns-svc\") pod \"dnsmasq-dns-5c99ddcf47-zm45x\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.787708 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-config\") pod \"dnsmasq-dns-5c99ddcf47-zm45x\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.787806 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wg72\" (UniqueName: \"kubernetes.io/projected/b9d7ac1c-c0c4-477c-babd-74166b286800-kube-api-access-8wg72\") pod \"dnsmasq-dns-5c99ddcf47-zm45x\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.889147 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-dns-svc\") pod \"dnsmasq-dns-5c99ddcf47-zm45x\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.889208 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-config\") pod \"dnsmasq-dns-5c99ddcf47-zm45x\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.889242 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wg72\" (UniqueName: \"kubernetes.io/projected/b9d7ac1c-c0c4-477c-babd-74166b286800-kube-api-access-8wg72\") pod \"dnsmasq-dns-5c99ddcf47-zm45x\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.890531 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-config\") pod \"dnsmasq-dns-5c99ddcf47-zm45x\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.890575 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-dns-svc\") pod \"dnsmasq-dns-5c99ddcf47-zm45x\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.906453 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wg72\" (UniqueName: \"kubernetes.io/projected/b9d7ac1c-c0c4-477c-babd-74166b286800-kube-api-access-8wg72\") pod \"dnsmasq-dns-5c99ddcf47-zm45x\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.947428 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c99ddcf47-zm45x"] Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.947914 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.972728 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bc4fb897-d58q6"] Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.973995 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:13 crc kubenswrapper[4831]: I1204 10:32:13.983315 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bc4fb897-d58q6"] Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.091534 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx56s\" (UniqueName: \"kubernetes.io/projected/bd68160b-20a1-40d4-b7c9-827aa56cf7db-kube-api-access-zx56s\") pod \"dnsmasq-dns-79bc4fb897-d58q6\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.091600 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-dns-svc\") pod \"dnsmasq-dns-79bc4fb897-d58q6\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.091719 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-config\") pod \"dnsmasq-dns-79bc4fb897-d58q6\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.193386 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-config\") pod \"dnsmasq-dns-79bc4fb897-d58q6\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.193558 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-dns-svc\") pod \"dnsmasq-dns-79bc4fb897-d58q6\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.193599 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx56s\" (UniqueName: \"kubernetes.io/projected/bd68160b-20a1-40d4-b7c9-827aa56cf7db-kube-api-access-zx56s\") pod \"dnsmasq-dns-79bc4fb897-d58q6\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.194310 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-config\") pod \"dnsmasq-dns-79bc4fb897-d58q6\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.194366 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-dns-svc\") pod \"dnsmasq-dns-79bc4fb897-d58q6\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.211499 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx56s\" (UniqueName: \"kubernetes.io/projected/bd68160b-20a1-40d4-b7c9-827aa56cf7db-kube-api-access-zx56s\") pod \"dnsmasq-dns-79bc4fb897-d58q6\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.291857 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.477281 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.478694 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.480493 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.480686 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gh87l" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.481003 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.481182 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.482006 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.482220 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.482388 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.500400 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.598422 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" 
(UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.598469 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.598496 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.598524 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.598544 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5k49\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-kube-api-access-d5k49\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.598569 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.598625 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.598648 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-config-data\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.598680 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/462ff702-35a4-4cbe-8155-3ce8a321bf48-pod-info\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.598697 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/462ff702-35a4-4cbe-8155-3ce8a321bf48-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.598713 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 
04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.700407 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-config-data\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.700463 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/462ff702-35a4-4cbe-8155-3ce8a321bf48-pod-info\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.700490 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/462ff702-35a4-4cbe-8155-3ce8a321bf48-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.700507 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.700533 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.700547 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.700574 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.700597 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.700615 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5k49\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-kube-api-access-d5k49\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.700631 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-server-conf\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.700706 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.701260 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.701592 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.701625 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.702185 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.702865 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-server-conf\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0" Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.705134 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-config-data\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.705142 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/462ff702-35a4-4cbe-8155-3ce8a321bf48-pod-info\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.705545 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.707518 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/462ff702-35a4-4cbe-8155-3ce8a321bf48-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.709183 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.724908 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.730068 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5k49\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-kube-api-access-d5k49\") pod \"rabbitmq-server-0\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " pod="openstack/rabbitmq-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.789313 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.793058 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.796818 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.796921 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.797016 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.797055 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ll27g"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.797146 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.797205 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.797251 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.801580 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.802073 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.904689 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.904781 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.904813 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.904864 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.904888 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d1e04df-4c2a-440f-b533-9903a58c8ecc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.904907 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw7nj\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-kube-api-access-vw7nj\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.904931 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.904963 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.905003 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.905027 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:14 crc kubenswrapper[4831]: I1204 10:32:14.905054 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d1e04df-4c2a-440f-b533-9903a58c8ecc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.007058 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.007136 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d1e04df-4c2a-440f-b533-9903a58c8ecc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.007167 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw7nj\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-kube-api-access-vw7nj\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.007196 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.007234 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.007276 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.007301 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.007332 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d1e04df-4c2a-440f-b533-9903a58c8ecc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.007387 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.007446 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.007473 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.008576 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.008677 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.009148 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.010815 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.011067 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.012797 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.013347 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.013426 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d1e04df-4c2a-440f-b533-9903a58c8ecc-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.013786 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.027228 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d1e04df-4c2a-440f-b533-9903a58c8ecc-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.028345 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw7nj\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-kube-api-access-vw7nj\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.040930 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.080933 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"]
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.082543 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.084295 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.086114 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.086178 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.086443 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-pl2nf"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.086732 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.086793 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.086898 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.095977 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"]
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.108361 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d13ed0c0-494b-46b5-965d-1426a9575119-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.108457 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d13ed0c0-494b-46b5-965d-1426a9575119-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.108533 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d13ed0c0-494b-46b5-965d-1426a9575119-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.108595 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4jmb\" (UniqueName: \"kubernetes.io/projected/d13ed0c0-494b-46b5-965d-1426a9575119-kube-api-access-d4jmb\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.108624 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.108645 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.108711 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d13ed0c0-494b-46b5-965d-1426a9575119-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.108759 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d13ed0c0-494b-46b5-965d-1426a9575119-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.108813 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.108933 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.108965 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.125069 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.210001 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d13ed0c0-494b-46b5-965d-1426a9575119-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.210070 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d13ed0c0-494b-46b5-965d-1426a9575119-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.210125 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.210194 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.210225 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.210254 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d13ed0c0-494b-46b5-965d-1426a9575119-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.210276 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d13ed0c0-494b-46b5-965d-1426a9575119-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.210300 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d13ed0c0-494b-46b5-965d-1426a9575119-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.210348 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4jmb\" (UniqueName: \"kubernetes.io/projected/d13ed0c0-494b-46b5-965d-1426a9575119-kube-api-access-d4jmb\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.210373 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.210397 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.212267 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.212493 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d13ed0c0-494b-46b5-965d-1426a9575119-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.212701 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.212789 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.213109 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d13ed0c0-494b-46b5-965d-1426a9575119-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.213698 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d13ed0c0-494b-46b5-965d-1426a9575119-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.214388 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.214783 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d13ed0c0-494b-46b5-965d-1426a9575119-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.216008 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d13ed0c0-494b-46b5-965d-1426a9575119-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.228758 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d13ed0c0-494b-46b5-965d-1426a9575119-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.231828 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.232463 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4jmb\" (UniqueName: \"kubernetes.io/projected/d13ed0c0-494b-46b5-965d-1426a9575119-kube-api-access-d4jmb\") pod \"rabbitmq-notifications-server-0\" (UID: \"d13ed0c0-494b-46b5-965d-1426a9575119\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:15 crc kubenswrapper[4831]: I1204 10:32:15.411482 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0"
Dec 04 10:32:16 crc kubenswrapper[4831]: I1204 10:32:16.970157 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 04 10:32:16 crc kubenswrapper[4831]: I1204 10:32:16.971817 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 04 10:32:16 crc kubenswrapper[4831]: I1204 10:32:16.975473 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 04 10:32:16 crc kubenswrapper[4831]: I1204 10:32:16.975746 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 04 10:32:16 crc kubenswrapper[4831]: I1204 10:32:16.975878 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4hx6k"
Dec 04 10:32:16 crc kubenswrapper[4831]: I1204 10:32:16.976589 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 04 10:32:16 crc kubenswrapper[4831]: I1204 10:32:16.976964 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 04 10:32:16 crc kubenswrapper[4831]: I1204 10:32:16.991893 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 04 10:32:16 crc kubenswrapper[4831]: I1204 10:32:16.991971 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.143617 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.143709 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-kolla-config\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.143744 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-secrets\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.143766 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws8cl\" (UniqueName: \"kubernetes.io/projected/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-kube-api-access-ws8cl\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.143924 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.143966 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.143988 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.144011 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.144131 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-config-data-default\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.245990 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.246056 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.246081 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.246106 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.246137 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-config-data-default\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.246205 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.246267 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-kolla-config\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.246305 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-secrets\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0"
Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.246328 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws8cl\" (UniqueName: \"kubernetes.io/projected/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-kube-api-access-ws8cl\") pod \"openstack-galera-0\" (UID: 
\"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0" Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.247411 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.247451 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0" Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.247729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-kolla-config\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0" Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.247781 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-config-data-default\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0" Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.248089 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0" Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.252460 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0" Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.259606 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0" Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.269275 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws8cl\" (UniqueName: \"kubernetes.io/projected/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-kube-api-access-ws8cl\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0" Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.276078 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d6faa4d2-bf02-4ed3-baa8-4fcf903fa422-secrets\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0" Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.285796 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422\") " pod="openstack/openstack-galera-0" Dec 04 10:32:17 crc kubenswrapper[4831]: I1204 10:32:17.305073 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.340452 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.342700 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.345897 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.345905 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.345967 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.347939 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-c92h5" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.373109 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.477766 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b3ada8b-0b18-4561-a770-80c259283ce1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.477816 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b3ada8b-0b18-4561-a770-80c259283ce1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.477846 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b3ada8b-0b18-4561-a770-80c259283ce1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.477872 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b3ada8b-0b18-4561-a770-80c259283ce1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.477931 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3ada8b-0b18-4561-a770-80c259283ce1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.477960 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8b3ada8b-0b18-4561-a770-80c259283ce1-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.478021 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.478056 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3ada8b-0b18-4561-a770-80c259283ce1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.478073 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvfj6\" (UniqueName: \"kubernetes.io/projected/8b3ada8b-0b18-4561-a770-80c259283ce1-kube-api-access-bvfj6\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.506845 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.507839 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.510590 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.514708 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.515053 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-sbtdl" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.517278 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580081 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b3ada8b-0b18-4561-a770-80c259283ce1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580145 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b3ada8b-0b18-4561-a770-80c259283ce1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580188 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b3ada8b-0b18-4561-a770-80c259283ce1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580229 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3ada8b-0b18-4561-a770-80c259283ce1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580253 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8b3ada8b-0b18-4561-a770-80c259283ce1-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580299 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580335 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3ada8b-0b18-4561-a770-80c259283ce1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580351 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvfj6\" (UniqueName: \"kubernetes.io/projected/8b3ada8b-0b18-4561-a770-80c259283ce1-kube-api-access-bvfj6\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580390 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/8b3ada8b-0b18-4561-a770-80c259283ce1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580770 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580842 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b3ada8b-0b18-4561-a770-80c259283ce1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.580847 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b3ada8b-0b18-4561-a770-80c259283ce1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.581114 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b3ada8b-0b18-4561-a770-80c259283ce1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.582031 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3ada8b-0b18-4561-a770-80c259283ce1-operator-scripts\") 
pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.583720 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8b3ada8b-0b18-4561-a770-80c259283ce1-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.584008 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b3ada8b-0b18-4561-a770-80c259283ce1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.596736 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3ada8b-0b18-4561-a770-80c259283ce1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.598252 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvfj6\" (UniqueName: \"kubernetes.io/projected/8b3ada8b-0b18-4561-a770-80c259283ce1-kube-api-access-bvfj6\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.613857 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8b3ada8b-0b18-4561-a770-80c259283ce1\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc 
kubenswrapper[4831]: I1204 10:32:18.662033 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.681707 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e102e9d-f79c-447b-9dff-dfb887b330fe-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.681751 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e102e9d-f79c-447b-9dff-dfb887b330fe-config-data\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.681770 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4e102e9d-f79c-447b-9dff-dfb887b330fe-kolla-config\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.682095 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhccr\" (UniqueName: \"kubernetes.io/projected/4e102e9d-f79c-447b-9dff-dfb887b330fe-kube-api-access-rhccr\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.682159 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e102e9d-f79c-447b-9dff-dfb887b330fe-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.783892 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhccr\" (UniqueName: \"kubernetes.io/projected/4e102e9d-f79c-447b-9dff-dfb887b330fe-kube-api-access-rhccr\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.783947 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e102e9d-f79c-447b-9dff-dfb887b330fe-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.783996 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e102e9d-f79c-447b-9dff-dfb887b330fe-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.784018 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e102e9d-f79c-447b-9dff-dfb887b330fe-config-data\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.784051 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4e102e9d-f79c-447b-9dff-dfb887b330fe-kolla-config\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.785294 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4e102e9d-f79c-447b-9dff-dfb887b330fe-kolla-config\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.785982 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e102e9d-f79c-447b-9dff-dfb887b330fe-config-data\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.787880 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e102e9d-f79c-447b-9dff-dfb887b330fe-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.800792 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhccr\" (UniqueName: \"kubernetes.io/projected/4e102e9d-f79c-447b-9dff-dfb887b330fe-kube-api-access-rhccr\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.805198 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e102e9d-f79c-447b-9dff-dfb887b330fe-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4e102e9d-f79c-447b-9dff-dfb887b330fe\") " pod="openstack/memcached-0" Dec 04 10:32:18 crc kubenswrapper[4831]: I1204 10:32:18.826149 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 04 10:32:20 crc kubenswrapper[4831]: I1204 10:32:20.625511 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:32:20 crc kubenswrapper[4831]: I1204 10:32:20.626505 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:32:20 crc kubenswrapper[4831]: I1204 10:32:20.628690 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-smzgq" Dec 04 10:32:20 crc kubenswrapper[4831]: I1204 10:32:20.634586 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:32:20 crc kubenswrapper[4831]: I1204 10:32:20.719623 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ckzz\" (UniqueName: \"kubernetes.io/projected/52a381c8-f093-4972-90ac-64799e0184c2-kube-api-access-5ckzz\") pod \"kube-state-metrics-0\" (UID: \"52a381c8-f093-4972-90ac-64799e0184c2\") " pod="openstack/kube-state-metrics-0" Dec 04 10:32:20 crc kubenswrapper[4831]: I1204 10:32:20.821354 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ckzz\" (UniqueName: \"kubernetes.io/projected/52a381c8-f093-4972-90ac-64799e0184c2-kube-api-access-5ckzz\") pod \"kube-state-metrics-0\" (UID: \"52a381c8-f093-4972-90ac-64799e0184c2\") " pod="openstack/kube-state-metrics-0" Dec 04 10:32:20 crc kubenswrapper[4831]: I1204 10:32:20.850538 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ckzz\" (UniqueName: \"kubernetes.io/projected/52a381c8-f093-4972-90ac-64799e0184c2-kube-api-access-5ckzz\") pod \"kube-state-metrics-0\" (UID: \"52a381c8-f093-4972-90ac-64799e0184c2\") " pod="openstack/kube-state-metrics-0" Dec 04 10:32:20 crc kubenswrapper[4831]: I1204 10:32:20.941869 4831 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:32:21 crc kubenswrapper[4831]: I1204 10:32:21.973093 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:32:21 crc kubenswrapper[4831]: I1204 10:32:21.974107 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:32:21 crc kubenswrapper[4831]: I1204 10:32:21.979061 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 10:32:21 crc kubenswrapper[4831]: I1204 10:32:21.981461 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:21 crc kubenswrapper[4831]: I1204 10:32:21.985317 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Dec 04 10:32:21 crc kubenswrapper[4831]: I1204 10:32:21.985740 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Dec 04 10:32:21 crc kubenswrapper[4831]: I1204 10:32:21.986000 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Dec 04 10:32:21 crc kubenswrapper[4831]: I1204 10:32:21.986870 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Dec 04 10:32:21 crc kubenswrapper[4831]: I1204 10:32:21.987207 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-6p9wz"
Dec 04 10:32:21 crc kubenswrapper[4831]: I1204 10:32:21.997345 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.040319 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.156923 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-config\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.156976 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.157011 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e058318a-8379-4f10-9860-7af36b3278e5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.157080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.157110 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.157131 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fgwv\" (UniqueName: \"kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-kube-api-access-7fgwv\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.157149 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.157183 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e058318a-8379-4f10-9860-7af36b3278e5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.258653 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e058318a-8379-4f10-9860-7af36b3278e5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.259230 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-config\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.259447 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.259639 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e058318a-8379-4f10-9860-7af36b3278e5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.259971 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.260151 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.260375 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fgwv\" (UniqueName: \"kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-kube-api-access-7fgwv\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.260535 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.260562 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e058318a-8379-4f10-9860-7af36b3278e5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.262337 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e058318a-8379-4f10-9860-7af36b3278e5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.262893 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.264804 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-config\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.265101 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.271375 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.271408 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c51bcc27fcdd05e3b872f332894f06756ce616ce82fb1b91e1091af3050124aa/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.278293 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.290622 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fgwv\" (UniqueName: \"kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-kube-api-access-7fgwv\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.305339 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:22 crc kubenswrapper[4831]: I1204 10:32:22.337361 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.732254 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2l4h5"]
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.733511 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.735314 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.735493 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-qzlcv"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.735525 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.750449 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2l4h5"]
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.793896 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-p9dj4"]
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.795860 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.804822 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p9dj4"]
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.895928 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc807f5-5d11-4f15-aff6-7c5377f10b33-ovn-controller-tls-certs\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.896011 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljfcb\" (UniqueName: \"kubernetes.io/projected/3bc807f5-5d11-4f15-aff6-7c5377f10b33-kube-api-access-ljfcb\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.896046 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bc807f5-5d11-4f15-aff6-7c5377f10b33-var-run\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.896076 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3bc807f5-5d11-4f15-aff6-7c5377f10b33-var-log-ovn\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.896112 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bc807f5-5d11-4f15-aff6-7c5377f10b33-scripts\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.896137 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc807f5-5d11-4f15-aff6-7c5377f10b33-combined-ca-bundle\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.896192 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bc807f5-5d11-4f15-aff6-7c5377f10b33-var-run-ovn\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:23 crc kubenswrapper[4831]: I1204 10:32:23.998006 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-var-log\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.000867 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc807f5-5d11-4f15-aff6-7c5377f10b33-ovn-controller-tls-certs\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.001328 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-etc-ovs\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.001493 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5xnm\" (UniqueName: \"kubernetes.io/projected/104e4cc3-8dfb-4315-8a21-91454f3b1a45-kube-api-access-r5xnm\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.001622 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljfcb\" (UniqueName: \"kubernetes.io/projected/3bc807f5-5d11-4f15-aff6-7c5377f10b33-kube-api-access-ljfcb\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.001780 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bc807f5-5d11-4f15-aff6-7c5377f10b33-var-run\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.002012 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/104e4cc3-8dfb-4315-8a21-91454f3b1a45-scripts\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.002048 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-var-lib\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.002097 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3bc807f5-5d11-4f15-aff6-7c5377f10b33-var-log-ovn\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.002369 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bc807f5-5d11-4f15-aff6-7c5377f10b33-scripts\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.002417 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc807f5-5d11-4f15-aff6-7c5377f10b33-combined-ca-bundle\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.002496 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bc807f5-5d11-4f15-aff6-7c5377f10b33-var-run-ovn\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.002526 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-var-run\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.003799 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bc807f5-5d11-4f15-aff6-7c5377f10b33-var-run-ovn\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.003932 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bc807f5-5d11-4f15-aff6-7c5377f10b33-var-run\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.004153 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3bc807f5-5d11-4f15-aff6-7c5377f10b33-var-log-ovn\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.006329 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bc807f5-5d11-4f15-aff6-7c5377f10b33-scripts\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.007673 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc807f5-5d11-4f15-aff6-7c5377f10b33-ovn-controller-tls-certs\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.008750 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc807f5-5d11-4f15-aff6-7c5377f10b33-combined-ca-bundle\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.034709 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljfcb\" (UniqueName: \"kubernetes.io/projected/3bc807f5-5d11-4f15-aff6-7c5377f10b33-kube-api-access-ljfcb\") pod \"ovn-controller-2l4h5\" (UID: \"3bc807f5-5d11-4f15-aff6-7c5377f10b33\") " pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.056844 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2l4h5"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.105049 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xnm\" (UniqueName: \"kubernetes.io/projected/104e4cc3-8dfb-4315-8a21-91454f3b1a45-kube-api-access-r5xnm\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.105115 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/104e4cc3-8dfb-4315-8a21-91454f3b1a45-scripts\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.105140 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-var-lib\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.105221 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-var-run\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.105286 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-var-log\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.105343 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-etc-ovs\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.105637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-etc-ovs\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.106782 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-var-run\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.107016 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-var-lib\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.107157 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/104e4cc3-8dfb-4315-8a21-91454f3b1a45-var-log\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.107808 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/104e4cc3-8dfb-4315-8a21-91454f3b1a45-scripts\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.126284 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xnm\" (UniqueName: \"kubernetes.io/projected/104e4cc3-8dfb-4315-8a21-91454f3b1a45-kube-api-access-r5xnm\") pod \"ovn-controller-ovs-p9dj4\" (UID: \"104e4cc3-8dfb-4315-8a21-91454f3b1a45\") " pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.295769 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.297460 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.313183 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.313200 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.313420 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.313792 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-gnblb"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.314143 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.335316 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.410562 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.410633 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7c249d5-00a5-428b-b259-54d4147f8392-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.410712 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2w5d\" (UniqueName: \"kubernetes.io/projected/d7c249d5-00a5-428b-b259-54d4147f8392-kube-api-access-v2w5d\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.410755 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c249d5-00a5-428b-b259-54d4147f8392-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.410788 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c249d5-00a5-428b-b259-54d4147f8392-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.413047 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c249d5-00a5-428b-b259-54d4147f8392-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.413153 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7c249d5-00a5-428b-b259-54d4147f8392-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.413427 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c249d5-00a5-428b-b259-54d4147f8392-config\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.427080 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-p9dj4"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.514991 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.515065 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7c249d5-00a5-428b-b259-54d4147f8392-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.515096 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2w5d\" (UniqueName: \"kubernetes.io/projected/d7c249d5-00a5-428b-b259-54d4147f8392-kube-api-access-v2w5d\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.515115 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c249d5-00a5-428b-b259-54d4147f8392-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.515154 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c249d5-00a5-428b-b259-54d4147f8392-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.515187 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c249d5-00a5-428b-b259-54d4147f8392-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.515235 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7c249d5-00a5-428b-b259-54d4147f8392-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.515316 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c249d5-00a5-428b-b259-54d4147f8392-config\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.516189 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c249d5-00a5-428b-b259-54d4147f8392-config\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.516410 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.518270 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7c249d5-00a5-428b-b259-54d4147f8392-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.519684 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7c249d5-00a5-428b-b259-54d4147f8392-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.521132 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c249d5-00a5-428b-b259-54d4147f8392-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.524178 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c249d5-00a5-428b-b259-54d4147f8392-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.524443 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c249d5-00a5-428b-b259-54d4147f8392-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0"
Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.535361 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2w5d\" 
(UniqueName: \"kubernetes.io/projected/d7c249d5-00a5-428b-b259-54d4147f8392-kube-api-access-v2w5d\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.583868 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d7c249d5-00a5-428b-b259-54d4147f8392\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:32:24 crc kubenswrapper[4831]: I1204 10:32:24.633021 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.207268 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.210743 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.214442 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.214980 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.215319 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-pz9fr" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.215448 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.246348 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.378989 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99fth\" (UniqueName: \"kubernetes.io/projected/0e48a25d-4544-4e4e-a835-8086fbb60f4d-kube-api-access-99fth\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.379056 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e48a25d-4544-4e4e-a835-8086fbb60f4d-config\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.379083 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e48a25d-4544-4e4e-a835-8086fbb60f4d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.379142 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e48a25d-4544-4e4e-a835-8086fbb60f4d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.379182 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.380314 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0e48a25d-4544-4e4e-a835-8086fbb60f4d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.380369 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e48a25d-4544-4e4e-a835-8086fbb60f4d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.380466 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e48a25d-4544-4e4e-a835-8086fbb60f4d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.482271 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e48a25d-4544-4e4e-a835-8086fbb60f4d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.482391 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99fth\" (UniqueName: \"kubernetes.io/projected/0e48a25d-4544-4e4e-a835-8086fbb60f4d-kube-api-access-99fth\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.482471 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e48a25d-4544-4e4e-a835-8086fbb60f4d-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.482538 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e48a25d-4544-4e4e-a835-8086fbb60f4d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.482651 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e48a25d-4544-4e4e-a835-8086fbb60f4d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.482818 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.482899 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e48a25d-4544-4e4e-a835-8086fbb60f4d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.482961 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e48a25d-4544-4e4e-a835-8086fbb60f4d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.484172 4831 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.484425 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e48a25d-4544-4e4e-a835-8086fbb60f4d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.484629 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e48a25d-4544-4e4e-a835-8086fbb60f4d-config\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.485807 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e48a25d-4544-4e4e-a835-8086fbb60f4d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.490532 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e48a25d-4544-4e4e-a835-8086fbb60f4d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.490725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e48a25d-4544-4e4e-a835-8086fbb60f4d-metrics-certs-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.502862 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e48a25d-4544-4e4e-a835-8086fbb60f4d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.508102 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99fth\" (UniqueName: \"kubernetes.io/projected/0e48a25d-4544-4e4e-a835-8086fbb60f4d-kube-api-access-99fth\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.508379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0e48a25d-4544-4e4e-a835-8086fbb60f4d\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:28 crc kubenswrapper[4831]: I1204 10:32:28.542775 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:31 crc kubenswrapper[4831]: E1204 10:32:31.742097 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 04 10:32:31 crc kubenswrapper[4831]: E1204 10:32:31.742492 4831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 04 10:32:31 crc kubenswrapper[4831]: E1204 10:32:31.742605 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.47:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j46f2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-659447cd97-nphz2_openstack(1c23142e-4541-40cb-b870-aeb15f8af94a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:32:31 crc kubenswrapper[4831]: E1204 10:32:31.743824 4831 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-659447cd97-nphz2" podUID="1c23142e-4541-40cb-b870-aeb15f8af94a" Dec 04 10:32:31 crc kubenswrapper[4831]: E1204 10:32:31.920981 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 04 10:32:31 crc kubenswrapper[4831]: E1204 10:32:31.921026 4831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 04 10:32:31 crc kubenswrapper[4831]: E1204 10:32:31.921124 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.47:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d46hz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6f59f64fdc-hl8mg_openstack(6e5d4c67-551c-49d3-b204-0562a898def5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:32:31 crc kubenswrapper[4831]: E1204 10:32:31.922468 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" podUID="6e5d4c67-551c-49d3-b204-0562a898def5" Dec 04 10:32:32 crc kubenswrapper[4831]: I1204 10:32:32.427592 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Dec 04 10:32:32 crc kubenswrapper[4831]: I1204 10:32:32.451115 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:32:32 crc kubenswrapper[4831]: I1204 10:32:32.463570 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bc4fb897-d58q6"] Dec 04 10:32:32 crc kubenswrapper[4831]: I1204 10:32:32.714279 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" event={"ID":"bd68160b-20a1-40d4-b7c9-827aa56cf7db","Type":"ContainerStarted","Data":"66b07a23dcd35cb0c180f3483f01cc599a1d62861f676143a9001d5ad22c04a5"} Dec 04 10:32:32 crc kubenswrapper[4831]: I1204 10:32:32.715547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d13ed0c0-494b-46b5-965d-1426a9575119","Type":"ContainerStarted","Data":"5350bfd078517cb9a32f2b222cfe1c31d10a726b97d2301031c641bf58888040"} Dec 04 10:32:32 crc kubenswrapper[4831]: I1204 10:32:32.716724 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d1e04df-4c2a-440f-b533-9903a58c8ecc","Type":"ContainerStarted","Data":"82a1c770b6e2cc059228f39ee796551f8e0645e72f5d1690a5c64bdb79664d6e"} Dec 04 10:32:32 crc kubenswrapper[4831]: I1204 10:32:32.819626 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c99ddcf47-zm45x"] Dec 04 10:32:32 crc kubenswrapper[4831]: I1204 10:32:32.842481 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 10:32:32 crc kubenswrapper[4831]: W1204 10:32:32.849430 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9d7ac1c_c0c4_477c_babd_74166b286800.slice/crio-ff783f3933db93c0a48cbe9f90d308a2f732bbcdcc140f046f945e01b2caf29f WatchSource:0}: Error finding container ff783f3933db93c0a48cbe9f90d308a2f732bbcdcc140f046f945e01b2caf29f: Status 404 returned error can't find the container with id ff783f3933db93c0a48cbe9f90d308a2f732bbcdcc140f046f945e01b2caf29f Dec 04 10:32:32 crc kubenswrapper[4831]: I1204 10:32:32.859442 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7964976f-vpdr9"] Dec 04 10:32:32 crc kubenswrapper[4831]: I1204 10:32:32.863877 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 10:32:32 crc kubenswrapper[4831]: W1204 10:32:32.870488 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e102e9d_f79c_447b_9dff_dfb887b330fe.slice/crio-e449d8ca3b90f399a3e2c080c07c546eeea47638604d966621f1ad37be7f1178 WatchSource:0}: Error finding container e449d8ca3b90f399a3e2c080c07c546eeea47638604d966621f1ad37be7f1178: Status 404 returned error can't find the container with id e449d8ca3b90f399a3e2c080c07c546eeea47638604d966621f1ad37be7f1178 Dec 04 10:32:32 crc kubenswrapper[4831]: I1204 10:32:32.870812 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:32:32 crc kubenswrapper[4831]: W1204 10:32:32.874029 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4335ff7d_e14c_4dcc_9c0e_53638cc7cb07.slice/crio-1b826e1aafd49dfff0fb8d05a33eafeb9d43c99657aa681b655db865dd715666 WatchSource:0}: Error finding container 1b826e1aafd49dfff0fb8d05a33eafeb9d43c99657aa681b655db865dd715666: Status 404 returned error can't find the container with id 1b826e1aafd49dfff0fb8d05a33eafeb9d43c99657aa681b655db865dd715666 Dec 04 10:32:32 crc kubenswrapper[4831]: 
I1204 10:32:32.959826 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.040671 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2l4h5"] Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.080561 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.096381 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.116571 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:33 crc kubenswrapper[4831]: W1204 10:32:33.119241 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52a381c8_f093_4972_90ac_64799e0184c2.slice/crio-b6430f35e1ea09b90b169b9825f390619c6c2acc8669b3c7a91400a7d2e03fe9 WatchSource:0}: Error finding container b6430f35e1ea09b90b169b9825f390619c6c2acc8669b3c7a91400a7d2e03fe9: Status 404 returned error can't find the container with id b6430f35e1ea09b90b169b9825f390619c6c2acc8669b3c7a91400a7d2e03fe9 Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.122217 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.161162 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p9dj4"] Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.267833 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-config\") pod \"1c23142e-4541-40cb-b870-aeb15f8af94a\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " Dec 04 10:32:33 crc 
kubenswrapper[4831]: I1204 10:32:33.267879 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-dns-svc\") pod \"1c23142e-4541-40cb-b870-aeb15f8af94a\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.267966 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j46f2\" (UniqueName: \"kubernetes.io/projected/1c23142e-4541-40cb-b870-aeb15f8af94a-kube-api-access-j46f2\") pod \"1c23142e-4541-40cb-b870-aeb15f8af94a\" (UID: \"1c23142e-4541-40cb-b870-aeb15f8af94a\") " Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.268494 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c23142e-4541-40cb-b870-aeb15f8af94a" (UID: "1c23142e-4541-40cb-b870-aeb15f8af94a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.268514 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-config" (OuterVolumeSpecName: "config") pod "1c23142e-4541-40cb-b870-aeb15f8af94a" (UID: "1c23142e-4541-40cb-b870-aeb15f8af94a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.274392 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c23142e-4541-40cb-b870-aeb15f8af94a-kube-api-access-j46f2" (OuterVolumeSpecName: "kube-api-access-j46f2") pod "1c23142e-4541-40cb-b870-aeb15f8af94a" (UID: "1c23142e-4541-40cb-b870-aeb15f8af94a"). InnerVolumeSpecName "kube-api-access-j46f2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.274409 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.369207 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d46hz\" (UniqueName: \"kubernetes.io/projected/6e5d4c67-551c-49d3-b204-0562a898def5-kube-api-access-d46hz\") pod \"6e5d4c67-551c-49d3-b204-0562a898def5\" (UID: \"6e5d4c67-551c-49d3-b204-0562a898def5\") " Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.369435 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e5d4c67-551c-49d3-b204-0562a898def5-config\") pod \"6e5d4c67-551c-49d3-b204-0562a898def5\" (UID: \"6e5d4c67-551c-49d3-b204-0562a898def5\") " Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.370627 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e5d4c67-551c-49d3-b204-0562a898def5-config" (OuterVolumeSpecName: "config") pod "6e5d4c67-551c-49d3-b204-0562a898def5" (UID: "6e5d4c67-551c-49d3-b204-0562a898def5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.371123 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j46f2\" (UniqueName: \"kubernetes.io/projected/1c23142e-4541-40cb-b870-aeb15f8af94a-kube-api-access-j46f2\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.371139 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e5d4c67-551c-49d3-b204-0562a898def5-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.371148 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.371157 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c23142e-4541-40cb-b870-aeb15f8af94a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.375135 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e5d4c67-551c-49d3-b204-0562a898def5-kube-api-access-d46hz" (OuterVolumeSpecName: "kube-api-access-d46hz") pod "6e5d4c67-551c-49d3-b204-0562a898def5" (UID: "6e5d4c67-551c-49d3-b204-0562a898def5"). InnerVolumeSpecName "kube-api-access-d46hz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.501207 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d46hz\" (UniqueName: \"kubernetes.io/projected/6e5d4c67-551c-49d3-b204-0562a898def5-kube-api-access-d46hz\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.730464 4831 generic.go:334] "Generic (PLEG): container finished" podID="bd68160b-20a1-40d4-b7c9-827aa56cf7db" containerID="4f81b583533d4e0dcc9389e7a6d1e4bf31f2cf087244eb0f86b24c714fd2aa81" exitCode=0 Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.730528 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" event={"ID":"bd68160b-20a1-40d4-b7c9-827aa56cf7db","Type":"ContainerDied","Data":"4f81b583533d4e0dcc9389e7a6d1e4bf31f2cf087244eb0f86b24c714fd2aa81"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.732637 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4e102e9d-f79c-447b-9dff-dfb887b330fe","Type":"ContainerStarted","Data":"e449d8ca3b90f399a3e2c080c07c546eeea47638604d966621f1ad37be7f1178"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.734520 4831 generic.go:334] "Generic (PLEG): container finished" podID="4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" containerID="ce2cbf40eecb6e3b2d3def3688272112834483b01f2228275d71a6a7c23316ff" exitCode=0 Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.734561 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" event={"ID":"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07","Type":"ContainerDied","Data":"ce2cbf40eecb6e3b2d3def3688272112834483b01f2228275d71a6a7c23316ff"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.734601 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" 
event={"ID":"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07","Type":"ContainerStarted","Data":"1b826e1aafd49dfff0fb8d05a33eafeb9d43c99657aa681b655db865dd715666"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.735747 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0e48a25d-4544-4e4e-a835-8086fbb60f4d","Type":"ContainerStarted","Data":"8d5de5c1c2cfe02e9a393ea57a1ef6e5064903906a08069949f7601a117bac3f"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.737616 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" event={"ID":"6e5d4c67-551c-49d3-b204-0562a898def5","Type":"ContainerDied","Data":"879fe661b14f8e581859602c747dabc5fa6ba62d89f2433f4db94352b98b8138"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.737633 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f59f64fdc-hl8mg" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.740172 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e058318a-8379-4f10-9860-7af36b3278e5","Type":"ContainerStarted","Data":"5641e197a46725bffb15cd9332375a4b4af9a436ec9f385e633700f071ba879e"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.742166 4831 generic.go:334] "Generic (PLEG): container finished" podID="b9d7ac1c-c0c4-477c-babd-74166b286800" containerID="6de4ae7976682f98d8c72cbb94360dd276e0786934bfccc7ba958fdf15b89635" exitCode=0 Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.742225 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" event={"ID":"b9d7ac1c-c0c4-477c-babd-74166b286800","Type":"ContainerDied","Data":"6de4ae7976682f98d8c72cbb94360dd276e0786934bfccc7ba958fdf15b89635"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.742251 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" event={"ID":"b9d7ac1c-c0c4-477c-babd-74166b286800","Type":"ContainerStarted","Data":"ff783f3933db93c0a48cbe9f90d308a2f732bbcdcc140f046f945e01b2caf29f"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.749875 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659447cd97-nphz2" event={"ID":"1c23142e-4541-40cb-b870-aeb15f8af94a","Type":"ContainerDied","Data":"b73117ebb1926e8dcc06240025af80dcda76757aa84d03c8f4b169d546f09c7c"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.749943 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659447cd97-nphz2" Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.759444 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"52a381c8-f093-4972-90ac-64799e0184c2","Type":"ContainerStarted","Data":"b6430f35e1ea09b90b169b9825f390619c6c2acc8669b3c7a91400a7d2e03fe9"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.760993 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2l4h5" event={"ID":"3bc807f5-5d11-4f15-aff6-7c5377f10b33","Type":"ContainerStarted","Data":"5c5ea299754af33720729dc977e5cb63a85ad29044c59df0e7cad8753b0c544e"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.762974 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b3ada8b-0b18-4561-a770-80c259283ce1","Type":"ContainerStarted","Data":"6a3e10621b1fb3dc7f87759b741584f43a04d08c7692afd048d1548e59cda8bc"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.775563 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9dj4" event={"ID":"104e4cc3-8dfb-4315-8a21-91454f3b1a45","Type":"ContainerStarted","Data":"83ed653b7cf4f529e559a82e0e56e9ced5515f9131f79dd05f41032c090677e9"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.778600 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422","Type":"ContainerStarted","Data":"b125eb860f5fbc35813a5e05bbfb018cb09038d5832e6a85cea25a79f6fa97dd"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.782032 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"462ff702-35a4-4cbe-8155-3ce8a321bf48","Type":"ContainerStarted","Data":"13964deb8d083e42080833315f3f4f2ab2c52f2675d2f0a8d620b17210c9e80b"} Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.827691 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659447cd97-nphz2"] Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.835882 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-659447cd97-nphz2"] Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.867027 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f59f64fdc-hl8mg"] Dec 04 10:32:33 crc kubenswrapper[4831]: I1204 10:32:33.888994 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f59f64fdc-hl8mg"] Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.015465 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.683284 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.723671 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-config\") pod \"b9d7ac1c-c0c4-477c-babd-74166b286800\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.723768 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-dns-svc\") pod \"b9d7ac1c-c0c4-477c-babd-74166b286800\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.724589 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wg72\" (UniqueName: \"kubernetes.io/projected/b9d7ac1c-c0c4-477c-babd-74166b286800-kube-api-access-8wg72\") pod \"b9d7ac1c-c0c4-477c-babd-74166b286800\" (UID: \"b9d7ac1c-c0c4-477c-babd-74166b286800\") " Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.729458 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9d7ac1c-c0c4-477c-babd-74166b286800-kube-api-access-8wg72" (OuterVolumeSpecName: "kube-api-access-8wg72") pod "b9d7ac1c-c0c4-477c-babd-74166b286800" (UID: "b9d7ac1c-c0c4-477c-babd-74166b286800"). InnerVolumeSpecName "kube-api-access-8wg72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.747485 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-config" (OuterVolumeSpecName: "config") pod "b9d7ac1c-c0c4-477c-babd-74166b286800" (UID: "b9d7ac1c-c0c4-477c-babd-74166b286800"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.756769 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b9d7ac1c-c0c4-477c-babd-74166b286800" (UID: "b9d7ac1c-c0c4-477c-babd-74166b286800"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.805180 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d7c249d5-00a5-428b-b259-54d4147f8392","Type":"ContainerStarted","Data":"fe349af874170067e4b09021ac83731e9c9441c5570bad36c13792a55eecfb00"} Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.807268 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" event={"ID":"b9d7ac1c-c0c4-477c-babd-74166b286800","Type":"ContainerDied","Data":"ff783f3933db93c0a48cbe9f90d308a2f732bbcdcc140f046f945e01b2caf29f"} Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.807365 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c99ddcf47-zm45x" Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.807501 4831 scope.go:117] "RemoveContainer" containerID="6de4ae7976682f98d8c72cbb94360dd276e0786934bfccc7ba958fdf15b89635" Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.825925 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.826271 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wg72\" (UniqueName: \"kubernetes.io/projected/b9d7ac1c-c0c4-477c-babd-74166b286800-kube-api-access-8wg72\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.826281 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d7ac1c-c0c4-477c-babd-74166b286800-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.859968 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c99ddcf47-zm45x"] Dec 04 10:32:34 crc kubenswrapper[4831]: I1204 10:32:34.869330 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c99ddcf47-zm45x"] Dec 04 10:32:35 crc kubenswrapper[4831]: I1204 10:32:35.290083 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c23142e-4541-40cb-b870-aeb15f8af94a" path="/var/lib/kubelet/pods/1c23142e-4541-40cb-b870-aeb15f8af94a/volumes" Dec 04 10:32:35 crc kubenswrapper[4831]: I1204 10:32:35.290618 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e5d4c67-551c-49d3-b204-0562a898def5" path="/var/lib/kubelet/pods/6e5d4c67-551c-49d3-b204-0562a898def5/volumes" Dec 04 10:32:35 crc kubenswrapper[4831]: I1204 10:32:35.291115 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b9d7ac1c-c0c4-477c-babd-74166b286800" path="/var/lib/kubelet/pods/b9d7ac1c-c0c4-477c-babd-74166b286800/volumes" Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.869226 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b3ada8b-0b18-4561-a770-80c259283ce1","Type":"ContainerStarted","Data":"cb488ff7e48935af44fbc4c178fbf7398480be03d1dd57fb697e99f89a873028"} Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.872694 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9dj4" event={"ID":"104e4cc3-8dfb-4315-8a21-91454f3b1a45","Type":"ContainerStarted","Data":"a89f8c75ca971dba3ff938c1040080f4b542eee801e6976d9e0707ffea5cf663"} Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.875097 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" event={"ID":"bd68160b-20a1-40d4-b7c9-827aa56cf7db","Type":"ContainerStarted","Data":"4a3abc2dd23062f9fe42690bb2b07751b4051de1783982c9bbb17bd0f32d4289"} Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.875432 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.877335 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4e102e9d-f79c-447b-9dff-dfb887b330fe","Type":"ContainerStarted","Data":"86d7db8c3d6e0e06dc91cddf3a4040dab57a34ea3beb4c4b0fea039ec8d900d4"} Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.877394 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.882886 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422","Type":"ContainerStarted","Data":"a053594f9250e42a3b8b485681880ee2c9de541904be9ad71228a06ce03f765f"} 
Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.887850 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" event={"ID":"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07","Type":"ContainerStarted","Data":"ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690"} Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.888547 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.917672 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" podStartSLOduration=27.917651231 podStartE2EDuration="27.917651231s" podCreationTimestamp="2025-12-04 10:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:32:40.913720317 +0000 UTC m=+1057.862895641" watchObservedRunningTime="2025-12-04 10:32:40.917651231 +0000 UTC m=+1057.866826545" Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.967072 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.061391983 podStartE2EDuration="22.967054433s" podCreationTimestamp="2025-12-04 10:32:18 +0000 UTC" firstStartedPulling="2025-12-04 10:32:32.879416995 +0000 UTC m=+1049.828592309" lastFinishedPulling="2025-12-04 10:32:38.785079435 +0000 UTC m=+1055.734254759" observedRunningTime="2025-12-04 10:32:40.956438833 +0000 UTC m=+1057.905614167" watchObservedRunningTime="2025-12-04 10:32:40.967054433 +0000 UTC m=+1057.916229747" Dec 04 10:32:40 crc kubenswrapper[4831]: I1204 10:32:40.981725 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" podStartSLOduration=27.897836089 podStartE2EDuration="27.98170338s" podCreationTimestamp="2025-12-04 10:32:13 +0000 UTC" 
firstStartedPulling="2025-12-04 10:32:32.473315816 +0000 UTC m=+1049.422491140" lastFinishedPulling="2025-12-04 10:32:32.557183117 +0000 UTC m=+1049.506358431" observedRunningTime="2025-12-04 10:32:40.971986123 +0000 UTC m=+1057.921161437" watchObservedRunningTime="2025-12-04 10:32:40.98170338 +0000 UTC m=+1057.930878714" Dec 04 10:32:41 crc kubenswrapper[4831]: I1204 10:32:41.901034 4831 generic.go:334] "Generic (PLEG): container finished" podID="104e4cc3-8dfb-4315-8a21-91454f3b1a45" containerID="a89f8c75ca971dba3ff938c1040080f4b542eee801e6976d9e0707ffea5cf663" exitCode=0 Dec 04 10:32:41 crc kubenswrapper[4831]: I1204 10:32:41.901099 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9dj4" event={"ID":"104e4cc3-8dfb-4315-8a21-91454f3b1a45","Type":"ContainerDied","Data":"a89f8c75ca971dba3ff938c1040080f4b542eee801e6976d9e0707ffea5cf663"} Dec 04 10:32:41 crc kubenswrapper[4831]: I1204 10:32:41.904196 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2l4h5" event={"ID":"3bc807f5-5d11-4f15-aff6-7c5377f10b33","Type":"ContainerStarted","Data":"08d1d14e0ff40646d2dcabb01108368cc4204d3b74fa3e0cd6e3f4b439746b7c"} Dec 04 10:32:41 crc kubenswrapper[4831]: I1204 10:32:41.904649 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-2l4h5" Dec 04 10:32:41 crc kubenswrapper[4831]: I1204 10:32:41.910843 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0e48a25d-4544-4e4e-a835-8086fbb60f4d","Type":"ContainerStarted","Data":"00dbf3f9256915593a4d7e0a5ad184921c511d5c1bc682f1567a50aaacf6be15"} Dec 04 10:32:41 crc kubenswrapper[4831]: I1204 10:32:41.950638 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2l4h5" podStartSLOduration=12.777176988 podStartE2EDuration="18.950622039s" podCreationTimestamp="2025-12-04 10:32:23 +0000 UTC" firstStartedPulling="2025-12-04 
10:32:33.063842598 +0000 UTC m=+1050.013017912" lastFinishedPulling="2025-12-04 10:32:39.237287649 +0000 UTC m=+1056.186462963" observedRunningTime="2025-12-04 10:32:41.943934493 +0000 UTC m=+1058.893109827" watchObservedRunningTime="2025-12-04 10:32:41.950622039 +0000 UTC m=+1058.899797353" Dec 04 10:32:42 crc kubenswrapper[4831]: I1204 10:32:42.919051 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d1e04df-4c2a-440f-b533-9903a58c8ecc","Type":"ContainerStarted","Data":"c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a"} Dec 04 10:32:42 crc kubenswrapper[4831]: I1204 10:32:42.920867 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"462ff702-35a4-4cbe-8155-3ce8a321bf48","Type":"ContainerStarted","Data":"92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96"} Dec 04 10:32:42 crc kubenswrapper[4831]: I1204 10:32:42.922497 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9dj4" event={"ID":"104e4cc3-8dfb-4315-8a21-91454f3b1a45","Type":"ContainerStarted","Data":"73aeccc0d483124ecf1262a259b7e7d4ce9a2dfd25c6a349528424c75c1224ee"} Dec 04 10:32:42 crc kubenswrapper[4831]: I1204 10:32:42.925045 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"52a381c8-f093-4972-90ac-64799e0184c2","Type":"ContainerStarted","Data":"98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985"} Dec 04 10:32:42 crc kubenswrapper[4831]: I1204 10:32:42.966061 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.554156378 podStartE2EDuration="22.966043366s" podCreationTimestamp="2025-12-04 10:32:20 +0000 UTC" firstStartedPulling="2025-12-04 10:32:33.123155052 +0000 UTC m=+1050.072330366" lastFinishedPulling="2025-12-04 10:32:41.53504204 +0000 UTC m=+1058.484217354" 
observedRunningTime="2025-12-04 10:32:42.96053707 +0000 UTC m=+1059.909712394" watchObservedRunningTime="2025-12-04 10:32:42.966043366 +0000 UTC m=+1059.915218700" Dec 04 10:32:43 crc kubenswrapper[4831]: I1204 10:32:43.935391 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d7c249d5-00a5-428b-b259-54d4147f8392","Type":"ContainerStarted","Data":"af4370eb3930263c1c10d0ead9aa8a60115d268de5396a0e0d4678e2337d43b7"} Dec 04 10:32:43 crc kubenswrapper[4831]: I1204 10:32:43.944784 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9dj4" event={"ID":"104e4cc3-8dfb-4315-8a21-91454f3b1a45","Type":"ContainerStarted","Data":"ff1f54b0c91c65a55644db884256a80017f7f386e51df6149ff24283e72af7f0"} Dec 04 10:32:43 crc kubenswrapper[4831]: I1204 10:32:43.945086 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p9dj4" Dec 04 10:32:43 crc kubenswrapper[4831]: I1204 10:32:43.946467 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d13ed0c0-494b-46b5-965d-1426a9575119","Type":"ContainerStarted","Data":"ce1e83f2670b8a1f56e0cacd4ecd566e73227dc859e046189c7f385da260a6de"} Dec 04 10:32:43 crc kubenswrapper[4831]: I1204 10:32:43.952425 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e058318a-8379-4f10-9860-7af36b3278e5","Type":"ContainerStarted","Data":"d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf"} Dec 04 10:32:43 crc kubenswrapper[4831]: I1204 10:32:43.954034 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 10:32:43 crc kubenswrapper[4831]: I1204 10:32:43.997713 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-p9dj4" podStartSLOduration=15.124871166 podStartE2EDuration="20.99769256s" 
podCreationTimestamp="2025-12-04 10:32:23 +0000 UTC" firstStartedPulling="2025-12-04 10:32:33.184112099 +0000 UTC m=+1050.133287413" lastFinishedPulling="2025-12-04 10:32:39.056933493 +0000 UTC m=+1056.006108807" observedRunningTime="2025-12-04 10:32:43.967413702 +0000 UTC m=+1060.916589026" watchObservedRunningTime="2025-12-04 10:32:43.99769256 +0000 UTC m=+1060.946867874" Dec 04 10:32:44 crc kubenswrapper[4831]: I1204 10:32:44.428279 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p9dj4" Dec 04 10:32:44 crc kubenswrapper[4831]: I1204 10:32:44.964024 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d7c249d5-00a5-428b-b259-54d4147f8392","Type":"ContainerStarted","Data":"cf406906ece1a13e6b758ecc4e21eecaf8b2f85e0eb30b1df38c80a77d9b40d8"} Dec 04 10:32:44 crc kubenswrapper[4831]: I1204 10:32:44.967995 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0e48a25d-4544-4e4e-a835-8086fbb60f4d","Type":"ContainerStarted","Data":"1d25d2a9466fe61af4998932105175977f2d4b1cf59e446e9bf0999cc05679a8"} Dec 04 10:32:44 crc kubenswrapper[4831]: I1204 10:32:44.990189 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=12.023145784 podStartE2EDuration="21.990131861s" podCreationTimestamp="2025-12-04 10:32:23 +0000 UTC" firstStartedPulling="2025-12-04 10:32:34.613500781 +0000 UTC m=+1051.562676095" lastFinishedPulling="2025-12-04 10:32:44.580486858 +0000 UTC m=+1061.529662172" observedRunningTime="2025-12-04 10:32:44.989995037 +0000 UTC m=+1061.939170351" watchObservedRunningTime="2025-12-04 10:32:44.990131861 +0000 UTC m=+1061.939307175" Dec 04 10:32:45 crc kubenswrapper[4831]: I1204 10:32:45.011304 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.420763839 
podStartE2EDuration="18.011282468s" podCreationTimestamp="2025-12-04 10:32:27 +0000 UTC" firstStartedPulling="2025-12-04 10:32:32.984784423 +0000 UTC m=+1049.933959737" lastFinishedPulling="2025-12-04 10:32:44.575303052 +0000 UTC m=+1061.524478366" observedRunningTime="2025-12-04 10:32:45.009472641 +0000 UTC m=+1061.958647985" watchObservedRunningTime="2025-12-04 10:32:45.011282468 +0000 UTC m=+1061.960457792" Dec 04 10:32:45 crc kubenswrapper[4831]: I1204 10:32:45.633624 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 04 10:32:45 crc kubenswrapper[4831]: I1204 10:32:45.978205 4831 generic.go:334] "Generic (PLEG): container finished" podID="d6faa4d2-bf02-4ed3-baa8-4fcf903fa422" containerID="a053594f9250e42a3b8b485681880ee2c9de541904be9ad71228a06ce03f765f" exitCode=0 Dec 04 10:32:45 crc kubenswrapper[4831]: I1204 10:32:45.978471 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422","Type":"ContainerDied","Data":"a053594f9250e42a3b8b485681880ee2c9de541904be9ad71228a06ce03f765f"} Dec 04 10:32:45 crc kubenswrapper[4831]: I1204 10:32:45.980497 4831 generic.go:334] "Generic (PLEG): container finished" podID="8b3ada8b-0b18-4561-a770-80c259283ce1" containerID="cb488ff7e48935af44fbc4c178fbf7398480be03d1dd57fb697e99f89a873028" exitCode=0 Dec 04 10:32:45 crc kubenswrapper[4831]: I1204 10:32:45.980602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b3ada8b-0b18-4561-a770-80c259283ce1","Type":"ContainerDied","Data":"cb488ff7e48935af44fbc4c178fbf7398480be03d1dd57fb697e99f89a873028"} Dec 04 10:32:46 crc kubenswrapper[4831]: I1204 10:32:46.543254 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:46 crc kubenswrapper[4831]: I1204 10:32:46.598031 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:46 crc kubenswrapper[4831]: I1204 10:32:46.990696 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d6faa4d2-bf02-4ed3-baa8-4fcf903fa422","Type":"ContainerStarted","Data":"a0eead162efafa4acd91521e3c79337fc0daddf3cec6d2c734c31b8809378db0"} Dec 04 10:32:46 crc kubenswrapper[4831]: I1204 10:32:46.992403 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b3ada8b-0b18-4561-a770-80c259283ce1","Type":"ContainerStarted","Data":"13879e128a448d5ade104f687e0e5a40abd962bd9ca722d3598e99b0cd46f25b"} Dec 04 10:32:46 crc kubenswrapper[4831]: I1204 10:32:46.992901 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.014653 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.408478384 podStartE2EDuration="32.014633916s" podCreationTimestamp="2025-12-04 10:32:15 +0000 UTC" firstStartedPulling="2025-12-04 10:32:32.882120586 +0000 UTC m=+1049.831295900" lastFinishedPulling="2025-12-04 10:32:39.488276118 +0000 UTC m=+1056.437451432" observedRunningTime="2025-12-04 10:32:47.008963836 +0000 UTC m=+1063.958139150" watchObservedRunningTime="2025-12-04 10:32:47.014633916 +0000 UTC m=+1063.963809230" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.037157 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=23.910733778 podStartE2EDuration="30.037134929s" podCreationTimestamp="2025-12-04 10:32:17 +0000 UTC" firstStartedPulling="2025-12-04 10:32:33.110857608 +0000 UTC m=+1050.060032922" lastFinishedPulling="2025-12-04 10:32:39.237258759 +0000 UTC m=+1056.186434073" observedRunningTime="2025-12-04 10:32:47.030653438 +0000 UTC m=+1063.979828772" 
watchObservedRunningTime="2025-12-04 10:32:47.037134929 +0000 UTC m=+1063.986310253" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.051779 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.306573 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.307017 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.340346 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7964976f-vpdr9"] Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.340722 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" podUID="4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" containerName="dnsmasq-dns" containerID="cri-o://ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690" gracePeriod=10 Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.349441 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.423863 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-lwc45"] Dec 04 10:32:47 crc kubenswrapper[4831]: E1204 10:32:47.424349 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9d7ac1c-c0c4-477c-babd-74166b286800" containerName="init" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.424373 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d7ac1c-c0c4-477c-babd-74166b286800" containerName="init" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.424675 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b9d7ac1c-c0c4-477c-babd-74166b286800" containerName="init" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.425372 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.428317 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.434238 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lwc45"] Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.456933 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c8f654669-pc8vw"] Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.464885 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.466919 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.476036 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8f654669-pc8vw"] Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.555868 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qld2k\" (UniqueName: \"kubernetes.io/projected/85ec4bf8-71be-4822-9afe-8d09a32d8a11-kube-api-access-qld2k\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.555931 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ec4bf8-71be-4822-9afe-8d09a32d8a11-combined-ca-bundle\") pod 
\"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.556031 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec4bf8-71be-4822-9afe-8d09a32d8a11-config\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.556062 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/85ec4bf8-71be-4822-9afe-8d09a32d8a11-ovn-rundir\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.556084 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ec4bf8-71be-4822-9afe-8d09a32d8a11-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.556114 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/85ec4bf8-71be-4822-9afe-8d09a32d8a11-ovs-rundir\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.626039 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bc4fb897-d58q6"] Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.626356 4831 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" podUID="bd68160b-20a1-40d4-b7c9-827aa56cf7db" containerName="dnsmasq-dns" containerID="cri-o://4a3abc2dd23062f9fe42690bb2b07751b4051de1783982c9bbb17bd0f32d4289" gracePeriod=10 Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.633849 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661018 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-dns-svc\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661112 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qld2k\" (UniqueName: \"kubernetes.io/projected/85ec4bf8-71be-4822-9afe-8d09a32d8a11-kube-api-access-qld2k\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661146 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ec4bf8-71be-4822-9afe-8d09a32d8a11-combined-ca-bundle\") pod \"ovn-controller-metrics-lwc45\" (UID: 
\"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661204 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xndbr\" (UniqueName: \"kubernetes.io/projected/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-kube-api-access-xndbr\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661240 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-config\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661278 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec4bf8-71be-4822-9afe-8d09a32d8a11-config\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661308 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/85ec4bf8-71be-4822-9afe-8d09a32d8a11-ovn-rundir\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661345 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ec4bf8-71be-4822-9afe-8d09a32d8a11-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lwc45\" (UID: 
\"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661374 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/85ec4bf8-71be-4822-9afe-8d09a32d8a11-ovs-rundir\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661709 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/85ec4bf8-71be-4822-9afe-8d09a32d8a11-ovs-rundir\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.661958 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/85ec4bf8-71be-4822-9afe-8d09a32d8a11-ovn-rundir\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.662987 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec4bf8-71be-4822-9afe-8d09a32d8a11-config\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.665130 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c8959dd5-nbrqs"] Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.671521 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.673101 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85ec4bf8-71be-4822-9afe-8d09a32d8a11-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.677535 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.681294 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ec4bf8-71be-4822-9afe-8d09a32d8a11-combined-ca-bundle\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.685630 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qld2k\" (UniqueName: \"kubernetes.io/projected/85ec4bf8-71be-4822-9afe-8d09a32d8a11-kube-api-access-qld2k\") pod \"ovn-controller-metrics-lwc45\" (UID: \"85ec4bf8-71be-4822-9afe-8d09a32d8a11\") " pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.702716 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c8959dd5-nbrqs"] Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.765533 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-dns-svc\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 
10:32:47.765603 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.765681 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xndbr\" (UniqueName: \"kubernetes.io/projected/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-kube-api-access-xndbr\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.765708 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-config\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.766583 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-config\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.767651 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-dns-svc\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.768531 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.810534 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xndbr\" (UniqueName: \"kubernetes.io/projected/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-kube-api-access-xndbr\") pod \"dnsmasq-dns-6c8f654669-pc8vw\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.821377 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-lwc45" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.834901 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.870587 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-sb\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.870709 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-dns-svc\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.870748 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-config\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.870789 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-nb\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.870849 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf69q\" (UniqueName: \"kubernetes.io/projected/bffa83b9-4c9f-4223-9390-d8dacbdf2436-kube-api-access-sf69q\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.914047 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.972866 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-dns-svc\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.972930 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-config\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.972953 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-nb\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.973142 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf69q\" (UniqueName: \"kubernetes.io/projected/bffa83b9-4c9f-4223-9390-d8dacbdf2436-kube-api-access-sf69q\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.973219 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-sb\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 
10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.975779 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-nb\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.975729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-sb\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.976018 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-dns-svc\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:47 crc kubenswrapper[4831]: I1204 10:32:47.976783 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-config\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.013457 4831 generic.go:334] "Generic (PLEG): container finished" podID="bd68160b-20a1-40d4-b7c9-827aa56cf7db" containerID="4a3abc2dd23062f9fe42690bb2b07751b4051de1783982c9bbb17bd0f32d4289" exitCode=0 Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.013543 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" 
event={"ID":"bd68160b-20a1-40d4-b7c9-827aa56cf7db","Type":"ContainerDied","Data":"4a3abc2dd23062f9fe42690bb2b07751b4051de1783982c9bbb17bd0f32d4289"} Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.016625 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf69q\" (UniqueName: \"kubernetes.io/projected/bffa83b9-4c9f-4223-9390-d8dacbdf2436-kube-api-access-sf69q\") pod \"dnsmasq-dns-5c8959dd5-nbrqs\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.021161 4831 generic.go:334] "Generic (PLEG): container finished" podID="4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" containerID="ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690" exitCode=0 Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.022010 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.022417 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" event={"ID":"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07","Type":"ContainerDied","Data":"ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690"} Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.022442 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7964976f-vpdr9" event={"ID":"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07","Type":"ContainerDied","Data":"1b826e1aafd49dfff0fb8d05a33eafeb9d43c99657aa681b655db865dd715666"} Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.022458 4831 scope.go:117] "RemoveContainer" containerID="ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.068287 4831 scope.go:117] "RemoveContainer" containerID="ce2cbf40eecb6e3b2d3def3688272112834483b01f2228275d71a6a7c23316ff" Dec 04 10:32:48 crc 
kubenswrapper[4831]: I1204 10:32:48.074170 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4slkr\" (UniqueName: \"kubernetes.io/projected/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-kube-api-access-4slkr\") pod \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.074231 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-config\") pod \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.074323 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-dns-svc\") pod \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\" (UID: \"4335ff7d-e14c-4dcc-9c0e-53638cc7cb07\") " Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.077802 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-kube-api-access-4slkr" (OuterVolumeSpecName: "kube-api-access-4slkr") pod "4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" (UID: "4335ff7d-e14c-4dcc-9c0e-53638cc7cb07"). InnerVolumeSpecName "kube-api-access-4slkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.088350 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.099276 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.133993 4831 scope.go:117] "RemoveContainer" containerID="ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690" Dec 04 10:32:48 crc kubenswrapper[4831]: E1204 10:32:48.135015 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690\": container with ID starting with ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690 not found: ID does not exist" containerID="ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.135049 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690"} err="failed to get container status \"ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690\": rpc error: code = NotFound desc = could not find container \"ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690\": container with ID starting with ddae15c5752a0812980305f78e11f7fa4f1fbc5d33f8b14a20939b578fa1c690 not found: ID does not exist" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.135076 4831 scope.go:117] "RemoveContainer" containerID="ce2cbf40eecb6e3b2d3def3688272112834483b01f2228275d71a6a7c23316ff" Dec 04 10:32:48 crc kubenswrapper[4831]: E1204 10:32:48.135278 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce2cbf40eecb6e3b2d3def3688272112834483b01f2228275d71a6a7c23316ff\": container with ID starting with ce2cbf40eecb6e3b2d3def3688272112834483b01f2228275d71a6a7c23316ff not found: ID does not exist" containerID="ce2cbf40eecb6e3b2d3def3688272112834483b01f2228275d71a6a7c23316ff" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 
10:32:48.135298 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2cbf40eecb6e3b2d3def3688272112834483b01f2228275d71a6a7c23316ff"} err="failed to get container status \"ce2cbf40eecb6e3b2d3def3688272112834483b01f2228275d71a6a7c23316ff\": rpc error: code = NotFound desc = could not find container \"ce2cbf40eecb6e3b2d3def3688272112834483b01f2228275d71a6a7c23316ff\": container with ID starting with ce2cbf40eecb6e3b2d3def3688272112834483b01f2228275d71a6a7c23316ff not found: ID does not exist" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.140538 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" (UID: "4335ff7d-e14c-4dcc-9c0e-53638cc7cb07"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.159009 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-config" (OuterVolumeSpecName: "config") pod "4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" (UID: "4335ff7d-e14c-4dcc-9c0e-53638cc7cb07"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.175790 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4slkr\" (UniqueName: \"kubernetes.io/projected/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-kube-api-access-4slkr\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.175821 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.175829 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.276475 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-config\") pod \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.276832 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-dns-svc\") pod \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.276911 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx56s\" (UniqueName: \"kubernetes.io/projected/bd68160b-20a1-40d4-b7c9-827aa56cf7db-kube-api-access-zx56s\") pod \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\" (UID: \"bd68160b-20a1-40d4-b7c9-827aa56cf7db\") " Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.280641 4831 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd68160b-20a1-40d4-b7c9-827aa56cf7db-kube-api-access-zx56s" (OuterVolumeSpecName: "kube-api-access-zx56s") pod "bd68160b-20a1-40d4-b7c9-827aa56cf7db" (UID: "bd68160b-20a1-40d4-b7c9-827aa56cf7db"). InnerVolumeSpecName "kube-api-access-zx56s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.336427 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lwc45"] Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.339207 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd68160b-20a1-40d4-b7c9-827aa56cf7db" (UID: "bd68160b-20a1-40d4-b7c9-827aa56cf7db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:32:48 crc kubenswrapper[4831]: W1204 10:32:48.340398 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85ec4bf8_71be_4822_9afe_8d09a32d8a11.slice/crio-2f97224236a94f88d418659aaafd0c356a7e329be96e339a0378d9f9225d203e WatchSource:0}: Error finding container 2f97224236a94f88d418659aaafd0c356a7e329be96e339a0378d9f9225d203e: Status 404 returned error can't find the container with id 2f97224236a94f88d418659aaafd0c356a7e329be96e339a0378d9f9225d203e Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.341037 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-config" (OuterVolumeSpecName: "config") pod "bd68160b-20a1-40d4-b7c9-827aa56cf7db" (UID: "bd68160b-20a1-40d4-b7c9-827aa56cf7db"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.358507 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c8959dd5-nbrqs"] Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.374415 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7964976f-vpdr9"] Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.378799 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx56s\" (UniqueName: \"kubernetes.io/projected/bd68160b-20a1-40d4-b7c9-827aa56cf7db-kube-api-access-zx56s\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.378821 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.378832 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd68160b-20a1-40d4-b7c9-827aa56cf7db-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.384791 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7964976f-vpdr9"] Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.473398 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8f654669-pc8vw"] Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.673287 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.673603 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.737239 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-nb-0" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.738133 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.810819 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.828043 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.981568 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 04 10:32:48 crc kubenswrapper[4831]: E1204 10:32:48.983634 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" containerName="dnsmasq-dns" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.983678 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" containerName="dnsmasq-dns" Dec 04 10:32:48 crc kubenswrapper[4831]: E1204 10:32:48.983720 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd68160b-20a1-40d4-b7c9-827aa56cf7db" containerName="dnsmasq-dns" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.983728 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd68160b-20a1-40d4-b7c9-827aa56cf7db" containerName="dnsmasq-dns" Dec 04 10:32:48 crc kubenswrapper[4831]: E1204 10:32:48.983746 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" containerName="init" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.983754 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" containerName="init" Dec 04 10:32:48 crc kubenswrapper[4831]: E1204 10:32:48.983768 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bd68160b-20a1-40d4-b7c9-827aa56cf7db" containerName="init" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.983778 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd68160b-20a1-40d4-b7c9-827aa56cf7db" containerName="init" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.983992 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" containerName="dnsmasq-dns" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.984008 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd68160b-20a1-40d4-b7c9-827aa56cf7db" containerName="dnsmasq-dns" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.985146 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.990159 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lszfh" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.990252 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.990388 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.990511 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 04 10:32:48 crc kubenswrapper[4831]: I1204 10:32:48.995414 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.031359 4831 generic.go:334] "Generic (PLEG): container finished" podID="bffa83b9-4c9f-4223-9390-d8dacbdf2436" containerID="82c03e7465660e1854decede12406448d7b3c039a3c938df11039f030fd90513" exitCode=0 Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.031431 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" event={"ID":"bffa83b9-4c9f-4223-9390-d8dacbdf2436","Type":"ContainerDied","Data":"82c03e7465660e1854decede12406448d7b3c039a3c938df11039f030fd90513"} Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.031457 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" event={"ID":"bffa83b9-4c9f-4223-9390-d8dacbdf2436","Type":"ContainerStarted","Data":"c59fc25014e988262b7408560b81bbda3037d7a6c386916242115c5999b85d7b"} Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.052449 4831 generic.go:334] "Generic (PLEG): container finished" podID="50ea30fe-69af-42c7-baf7-3a3bb0ab7882" containerID="e7333a20bd1aa184b46175df8c8ee084e6bad29cb8fb6a720bb07c9342782426" exitCode=0 Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.052558 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" event={"ID":"50ea30fe-69af-42c7-baf7-3a3bb0ab7882","Type":"ContainerDied","Data":"e7333a20bd1aa184b46175df8c8ee084e6bad29cb8fb6a720bb07c9342782426"} Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.052592 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" event={"ID":"50ea30fe-69af-42c7-baf7-3a3bb0ab7882","Type":"ContainerStarted","Data":"f1bf1600d403e8f94bf1136132bda283d6c9dcba948f3317e6d91777b75b2c58"} Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.062152 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" event={"ID":"bd68160b-20a1-40d4-b7c9-827aa56cf7db","Type":"ContainerDied","Data":"66b07a23dcd35cb0c180f3483f01cc599a1d62861f676143a9001d5ad22c04a5"} Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.062200 4831 scope.go:117] "RemoveContainer" containerID="4a3abc2dd23062f9fe42690bb2b07751b4051de1783982c9bbb17bd0f32d4289" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.062329 4831 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bc4fb897-d58q6" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.099947 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lwc45" event={"ID":"85ec4bf8-71be-4822-9afe-8d09a32d8a11","Type":"ContainerStarted","Data":"d0f5d87988ec815ed3480a0963b9f4f037cf770bf3cd5b6f9fe4ddb970e94b53"} Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.100002 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lwc45" event={"ID":"85ec4bf8-71be-4822-9afe-8d09a32d8a11","Type":"ContainerStarted","Data":"2f97224236a94f88d418659aaafd0c356a7e329be96e339a0378d9f9225d203e"} Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.111830 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9959d097-1e28-411b-a24c-6040036e2f1a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.111877 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxzzz\" (UniqueName: \"kubernetes.io/projected/9959d097-1e28-411b-a24c-6040036e2f1a-kube-api-access-jxzzz\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.111906 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9959d097-1e28-411b-a24c-6040036e2f1a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.111931 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9959d097-1e28-411b-a24c-6040036e2f1a-scripts\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.111949 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9959d097-1e28-411b-a24c-6040036e2f1a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.111969 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9959d097-1e28-411b-a24c-6040036e2f1a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.112044 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9959d097-1e28-411b-a24c-6040036e2f1a-config\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.126421 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-lwc45" podStartSLOduration=2.126404772 podStartE2EDuration="2.126404772s" podCreationTimestamp="2025-12-04 10:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:32:49.121288387 +0000 UTC m=+1066.070463701" watchObservedRunningTime="2025-12-04 10:32:49.126404772 +0000 UTC m=+1066.075580086" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 
10:32:49.158858 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bc4fb897-d58q6"] Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.159268 4831 scope.go:117] "RemoveContainer" containerID="4f81b583533d4e0dcc9389e7a6d1e4bf31f2cf087244eb0f86b24c714fd2aa81" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.163729 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bc4fb897-d58q6"] Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.213899 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9959d097-1e28-411b-a24c-6040036e2f1a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.213936 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxzzz\" (UniqueName: \"kubernetes.io/projected/9959d097-1e28-411b-a24c-6040036e2f1a-kube-api-access-jxzzz\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.213971 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9959d097-1e28-411b-a24c-6040036e2f1a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.214030 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9959d097-1e28-411b-a24c-6040036e2f1a-scripts\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.214055 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9959d097-1e28-411b-a24c-6040036e2f1a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.214077 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9959d097-1e28-411b-a24c-6040036e2f1a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.214262 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9959d097-1e28-411b-a24c-6040036e2f1a-config\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.215054 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9959d097-1e28-411b-a24c-6040036e2f1a-config\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.215605 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9959d097-1e28-411b-a24c-6040036e2f1a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.219638 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9959d097-1e28-411b-a24c-6040036e2f1a-scripts\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" 
Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.220892 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9959d097-1e28-411b-a24c-6040036e2f1a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.226747 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9959d097-1e28-411b-a24c-6040036e2f1a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.232802 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9959d097-1e28-411b-a24c-6040036e2f1a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.235123 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxzzz\" (UniqueName: \"kubernetes.io/projected/9959d097-1e28-411b-a24c-6040036e2f1a-kube-api-access-jxzzz\") pod \"ovn-northd-0\" (UID: \"9959d097-1e28-411b-a24c-6040036e2f1a\") " pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.287586 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4335ff7d-e14c-4dcc-9c0e-53638cc7cb07" path="/var/lib/kubelet/pods/4335ff7d-e14c-4dcc-9c0e-53638cc7cb07/volumes" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.288976 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd68160b-20a1-40d4-b7c9-827aa56cf7db" path="/var/lib/kubelet/pods/bd68160b-20a1-40d4-b7c9-827aa56cf7db/volumes" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 
10:32:49.320461 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 10:32:49 crc kubenswrapper[4831]: I1204 10:32:49.813311 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 10:32:50 crc kubenswrapper[4831]: I1204 10:32:50.117325 4831 generic.go:334] "Generic (PLEG): container finished" podID="e058318a-8379-4f10-9860-7af36b3278e5" containerID="d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf" exitCode=0 Dec 04 10:32:50 crc kubenswrapper[4831]: I1204 10:32:50.117406 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e058318a-8379-4f10-9860-7af36b3278e5","Type":"ContainerDied","Data":"d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf"} Dec 04 10:32:50 crc kubenswrapper[4831]: I1204 10:32:50.123947 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" event={"ID":"bffa83b9-4c9f-4223-9390-d8dacbdf2436","Type":"ContainerStarted","Data":"a2df90fc4a087c6a37c1d21c67b8f2758ee28b366dc705cb113621a7fb7d3078"} Dec 04 10:32:50 crc kubenswrapper[4831]: I1204 10:32:50.124059 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:32:50 crc kubenswrapper[4831]: I1204 10:32:50.126487 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" event={"ID":"50ea30fe-69af-42c7-baf7-3a3bb0ab7882","Type":"ContainerStarted","Data":"2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa"} Dec 04 10:32:50 crc kubenswrapper[4831]: I1204 10:32:50.127006 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:50 crc kubenswrapper[4831]: I1204 10:32:50.133444 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"9959d097-1e28-411b-a24c-6040036e2f1a","Type":"ContainerStarted","Data":"0e9d2dfdb0714661aca5c90cde51bbd7969b63982bed7efc913dabd056d3c060"} Dec 04 10:32:50 crc kubenswrapper[4831]: I1204 10:32:50.162134 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" podStartSLOduration=3.162115054 podStartE2EDuration="3.162115054s" podCreationTimestamp="2025-12-04 10:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:32:50.156774093 +0000 UTC m=+1067.105949407" watchObservedRunningTime="2025-12-04 10:32:50.162115054 +0000 UTC m=+1067.111290368" Dec 04 10:32:50 crc kubenswrapper[4831]: I1204 10:32:50.175622 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" podStartSLOduration=3.175602089 podStartE2EDuration="3.175602089s" podCreationTimestamp="2025-12-04 10:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:32:50.172427466 +0000 UTC m=+1067.121602790" watchObservedRunningTime="2025-12-04 10:32:50.175602089 +0000 UTC m=+1067.124777403" Dec 04 10:32:50 crc kubenswrapper[4831]: E1204 10:32:50.176776 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.146:55522->38.102.83.146:38009: write tcp 38.102.83.146:55522->38.102.83.146:38009: write: broken pipe Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.094221 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c8959dd5-nbrqs"] Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.185010 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-798f579549-frr45"] Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.197074 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.203595 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-798f579549-frr45"] Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.207572 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9959d097-1e28-411b-a24c-6040036e2f1a","Type":"ContainerStarted","Data":"179f9a809f6f2aa83bd7f32301f5e3f0561046244535345c88ad952635a1846e"} Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.207636 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9959d097-1e28-411b-a24c-6040036e2f1a","Type":"ContainerStarted","Data":"0cc53048a0c869d826fb2f423f6d84552e7f6a499f8f472eef31c461d0f243a7"} Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.208407 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.257607 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmcwk\" (UniqueName: \"kubernetes.io/projected/83014775-6674-44f9-86ed-952fec02b15b-kube-api-access-dmcwk\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.257702 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-dns-svc\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.257763 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-config\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.257783 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-sb\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.257821 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-nb\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.292957 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.313953 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.582270583 podStartE2EDuration="3.313931497s" podCreationTimestamp="2025-12-04 10:32:48 +0000 UTC" firstStartedPulling="2025-12-04 10:32:49.828191448 +0000 UTC m=+1066.777366762" lastFinishedPulling="2025-12-04 10:32:50.559852362 +0000 UTC m=+1067.509027676" observedRunningTime="2025-12-04 10:32:51.298444689 +0000 UTC m=+1068.247620003" watchObservedRunningTime="2025-12-04 10:32:51.313931497 +0000 UTC m=+1068.263106821" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.362021 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-config\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.362073 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-sb\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.362167 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-nb\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.362326 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmcwk\" (UniqueName: \"kubernetes.io/projected/83014775-6674-44f9-86ed-952fec02b15b-kube-api-access-dmcwk\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.362383 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-dns-svc\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.365909 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-config\") pod 
\"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.366239 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-dns-svc\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.367054 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-sb\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.371118 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-nb\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.394609 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmcwk\" (UniqueName: \"kubernetes.io/projected/83014775-6674-44f9-86ed-952fec02b15b-kube-api-access-dmcwk\") pod \"dnsmasq-dns-798f579549-frr45\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.548332 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.971339 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.971743 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.971801 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.973452 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e3d48d2b893c2eb24b90277e2cc2d8a14727460b3bf0732b9f1999efdd5e7c27"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:32:51 crc kubenswrapper[4831]: I1204 10:32:51.973554 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://e3d48d2b893c2eb24b90277e2cc2d8a14727460b3bf0732b9f1999efdd5e7c27" gracePeriod=600 Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.013419 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-798f579549-frr45"] Dec 04 10:32:52 crc kubenswrapper[4831]: W1204 10:32:52.043986 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83014775_6674_44f9_86ed_952fec02b15b.slice/crio-29354a5193235f5a90ab979a3cc8c805f488f3be134ad5471c395b75724e083f WatchSource:0}: Error finding container 29354a5193235f5a90ab979a3cc8c805f488f3be134ad5471c395b75724e083f: Status 404 returned error can't find the container with id 29354a5193235f5a90ab979a3cc8c805f488f3be134ad5471c395b75724e083f Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.218766 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798f579549-frr45" event={"ID":"83014775-6674-44f9-86ed-952fec02b15b","Type":"ContainerStarted","Data":"29354a5193235f5a90ab979a3cc8c805f488f3be134ad5471c395b75724e083f"} Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.218910 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" podUID="bffa83b9-4c9f-4223-9390-d8dacbdf2436" containerName="dnsmasq-dns" containerID="cri-o://a2df90fc4a087c6a37c1d21c67b8f2758ee28b366dc705cb113621a7fb7d3078" gracePeriod=10 Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.263371 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.269061 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.272887 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-czzs5" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.273068 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.273160 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.279774 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.288407 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.379852 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8b58305-3383-4cf9-9127-481c1bf16ba5-lock\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.379907 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8b58305-3383-4cf9-9127-481c1bf16ba5-cache\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.379928 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc 
kubenswrapper[4831]: I1204 10:32:52.379955 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp74m\" (UniqueName: \"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-kube-api-access-fp74m\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.379981 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.481331 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8b58305-3383-4cf9-9127-481c1bf16ba5-lock\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.481390 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8b58305-3383-4cf9-9127-481c1bf16ba5-cache\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.481417 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.481447 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp74m\" (UniqueName: 
\"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-kube-api-access-fp74m\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.481474 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.481882 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: E1204 10:32:52.482139 4831 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:32:52 crc kubenswrapper[4831]: E1204 10:32:52.482163 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:32:52 crc kubenswrapper[4831]: E1204 10:32:52.482218 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift podName:c8b58305-3383-4cf9-9127-481c1bf16ba5 nodeName:}" failed. No retries permitted until 2025-12-04 10:32:52.982196953 +0000 UTC m=+1069.931372267 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift") pod "swift-storage-0" (UID: "c8b58305-3383-4cf9-9127-481c1bf16ba5") : configmap "swift-ring-files" not found Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.482457 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8b58305-3383-4cf9-9127-481c1bf16ba5-cache\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.482639 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8b58305-3383-4cf9-9127-481c1bf16ba5-lock\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.504982 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp74m\" (UniqueName: \"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-kube-api-access-fp74m\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.511016 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.821220 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-xh7nf"] Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.824066 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.826462 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.829632 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.837295 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.870293 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xh7nf"] Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.902190 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f9eb652-90e2-4231-a441-a9947e9fc782-etc-swift\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.902281 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zccvx\" (UniqueName: \"kubernetes.io/projected/6f9eb652-90e2-4231-a441-a9947e9fc782-kube-api-access-zccvx\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.902308 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-dispersionconf\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 
10:32:52.902328 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-combined-ca-bundle\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.902373 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-swiftconf\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.902417 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-ring-data-devices\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:52 crc kubenswrapper[4831]: I1204 10:32:52.902442 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-scripts\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.004395 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f9eb652-90e2-4231-a441-a9947e9fc782-etc-swift\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.004494 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.004599 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zccvx\" (UniqueName: \"kubernetes.io/projected/6f9eb652-90e2-4231-a441-a9947e9fc782-kube-api-access-zccvx\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.004636 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-dispersionconf\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.004788 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-combined-ca-bundle\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.004862 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-swiftconf\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.004952 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-ring-data-devices\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.004984 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-scripts\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.005175 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f9eb652-90e2-4231-a441-a9947e9fc782-etc-swift\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: E1204 10:32:53.005472 4831 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:32:53 crc kubenswrapper[4831]: E1204 10:32:53.005495 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:32:53 crc kubenswrapper[4831]: E1204 10:32:53.005585 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift podName:c8b58305-3383-4cf9-9127-481c1bf16ba5 nodeName:}" failed. No retries permitted until 2025-12-04 10:32:54.005558814 +0000 UTC m=+1070.954734428 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift") pod "swift-storage-0" (UID: "c8b58305-3383-4cf9-9127-481c1bf16ba5") : configmap "swift-ring-files" not found Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.006137 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-scripts\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.008612 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-ring-data-devices\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.008980 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-dispersionconf\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.010603 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-combined-ca-bundle\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.016515 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-swiftconf\") pod 
\"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.053005 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zccvx\" (UniqueName: \"kubernetes.io/projected/6f9eb652-90e2-4231-a441-a9947e9fc782-kube-api-access-zccvx\") pod \"swift-ring-rebalance-xh7nf\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.221309 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.234974 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="e3d48d2b893c2eb24b90277e2cc2d8a14727460b3bf0732b9f1999efdd5e7c27" exitCode=0 Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.235055 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"e3d48d2b893c2eb24b90277e2cc2d8a14727460b3bf0732b9f1999efdd5e7c27"} Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.235123 4831 scope.go:117] "RemoveContainer" containerID="3282642ca73576095ec0c4e4a39a50ef22334c6e41917a860b9796381706e28c" Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.238039 4831 generic.go:334] "Generic (PLEG): container finished" podID="bffa83b9-4c9f-4223-9390-d8dacbdf2436" containerID="a2df90fc4a087c6a37c1d21c67b8f2758ee28b366dc705cb113621a7fb7d3078" exitCode=0 Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.238134 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" 
event={"ID":"bffa83b9-4c9f-4223-9390-d8dacbdf2436","Type":"ContainerDied","Data":"a2df90fc4a087c6a37c1d21c67b8f2758ee28b366dc705cb113621a7fb7d3078"} Dec 04 10:32:53 crc kubenswrapper[4831]: I1204 10:32:53.902490 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xh7nf"] Dec 04 10:32:54 crc kubenswrapper[4831]: I1204 10:32:54.034320 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:54 crc kubenswrapper[4831]: E1204 10:32:54.034476 4831 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:32:54 crc kubenswrapper[4831]: E1204 10:32:54.034498 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:32:54 crc kubenswrapper[4831]: E1204 10:32:54.034556 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift podName:c8b58305-3383-4cf9-9127-481c1bf16ba5 nodeName:}" failed. No retries permitted until 2025-12-04 10:32:56.034536178 +0000 UTC m=+1072.983711492 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift") pod "swift-storage-0" (UID: "c8b58305-3383-4cf9-9127-481c1bf16ba5") : configmap "swift-ring-files" not found Dec 04 10:32:54 crc kubenswrapper[4831]: I1204 10:32:54.249302 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xh7nf" event={"ID":"6f9eb652-90e2-4231-a441-a9947e9fc782","Type":"ContainerStarted","Data":"bbacc049ede87a697a393c4a560ed3bdbc6f7c1f88a6140244c0fa57f36ce832"} Dec 04 10:32:56 crc kubenswrapper[4831]: I1204 10:32:56.073206 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:32:56 crc kubenswrapper[4831]: E1204 10:32:56.073504 4831 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:32:56 crc kubenswrapper[4831]: E1204 10:32:56.073752 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:32:56 crc kubenswrapper[4831]: E1204 10:32:56.073848 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift podName:c8b58305-3383-4cf9-9127-481c1bf16ba5 nodeName:}" failed. No retries permitted until 2025-12-04 10:33:00.073819414 +0000 UTC m=+1077.022994738 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift") pod "swift-storage-0" (UID: "c8b58305-3383-4cf9-9127-481c1bf16ba5") : configmap "swift-ring-files" not found Dec 04 10:32:57 crc kubenswrapper[4831]: I1204 10:32:57.836775 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:32:58 crc kubenswrapper[4831]: I1204 10:32:58.092194 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" podUID="bffa83b9-4c9f-4223-9390-d8dacbdf2436" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Dec 04 10:32:59 crc kubenswrapper[4831]: I1204 10:32:59.568430 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 04 10:32:59 crc kubenswrapper[4831]: I1204 10:32:59.651017 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="8b3ada8b-0b18-4561-a770-80c259283ce1" containerName="galera" probeResult="failure" output=< Dec 04 10:32:59 crc kubenswrapper[4831]: wsrep_local_state_comment (Joined) differs from Synced Dec 04 10:32:59 crc kubenswrapper[4831]: > Dec 04 10:33:00 crc kubenswrapper[4831]: I1204 10:33:00.152800 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:33:00 crc kubenswrapper[4831]: E1204 10:33:00.153044 4831 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:33:00 crc kubenswrapper[4831]: E1204 10:33:00.153073 4831 projected.go:194] Error preparing data for projected volume 
etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:33:00 crc kubenswrapper[4831]: E1204 10:33:00.153145 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift podName:c8b58305-3383-4cf9-9127-481c1bf16ba5 nodeName:}" failed. No retries permitted until 2025-12-04 10:33:08.153122185 +0000 UTC m=+1085.102297499 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift") pod "swift-storage-0" (UID: "c8b58305-3383-4cf9-9127-481c1bf16ba5") : configmap "swift-ring-files" not found Dec 04 10:33:00 crc kubenswrapper[4831]: I1204 10:33:00.299728 4831 generic.go:334] "Generic (PLEG): container finished" podID="83014775-6674-44f9-86ed-952fec02b15b" containerID="b5d85a848c1efe10786a4e2f27f8145b231618abe83e713948a9be8257ff2bab" exitCode=0 Dec 04 10:33:00 crc kubenswrapper[4831]: I1204 10:33:00.299773 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798f579549-frr45" event={"ID":"83014775-6674-44f9-86ed-952fec02b15b","Type":"ContainerDied","Data":"b5d85a848c1efe10786a4e2f27f8145b231618abe83e713948a9be8257ff2bab"} Dec 04 10:33:00 crc kubenswrapper[4831]: I1204 10:33:00.749608 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 04 10:33:00 crc kubenswrapper[4831]: I1204 10:33:00.816835 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 04 10:33:04 crc kubenswrapper[4831]: E1204 10:33:04.335088 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b" 
Dec 04 10:33:04 crc kubenswrapper[4831]: E1204 10:33:04.335652 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,Command:[],Args:[--config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.enable-remote-write-receiver --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus --web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7fgwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liv
enessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(e058318a-8379-4f10-9860-7af36b3278e5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.352654 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" 
event={"ID":"bffa83b9-4c9f-4223-9390-d8dacbdf2436","Type":"ContainerDied","Data":"c59fc25014e988262b7408560b81bbda3037d7a6c386916242115c5999b85d7b"} Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.352720 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c59fc25014e988262b7408560b81bbda3037d7a6c386916242115c5999b85d7b" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.389776 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.433195 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.538861 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-nb\") pod \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.538971 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-sb\") pod \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.539021 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-dns-svc\") pod \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.539090 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-config\") pod \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.539131 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf69q\" (UniqueName: \"kubernetes.io/projected/bffa83b9-4c9f-4223-9390-d8dacbdf2436-kube-api-access-sf69q\") pod \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\" (UID: \"bffa83b9-4c9f-4223-9390-d8dacbdf2436\") " Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.547222 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bffa83b9-4c9f-4223-9390-d8dacbdf2436-kube-api-access-sf69q" (OuterVolumeSpecName: "kube-api-access-sf69q") pod "bffa83b9-4c9f-4223-9390-d8dacbdf2436" (UID: "bffa83b9-4c9f-4223-9390-d8dacbdf2436"). InnerVolumeSpecName "kube-api-access-sf69q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.586715 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bffa83b9-4c9f-4223-9390-d8dacbdf2436" (UID: "bffa83b9-4c9f-4223-9390-d8dacbdf2436"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.594066 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-config" (OuterVolumeSpecName: "config") pod "bffa83b9-4c9f-4223-9390-d8dacbdf2436" (UID: "bffa83b9-4c9f-4223-9390-d8dacbdf2436"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.602346 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bffa83b9-4c9f-4223-9390-d8dacbdf2436" (UID: "bffa83b9-4c9f-4223-9390-d8dacbdf2436"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.602449 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bffa83b9-4c9f-4223-9390-d8dacbdf2436" (UID: "bffa83b9-4c9f-4223-9390-d8dacbdf2436"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.640541 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.640565 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf69q\" (UniqueName: \"kubernetes.io/projected/bffa83b9-4c9f-4223-9390-d8dacbdf2436-kube-api-access-sf69q\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.640575 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.640583 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 
10:33:04 crc kubenswrapper[4831]: I1204 10:33:04.640592 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bffa83b9-4c9f-4223-9390-d8dacbdf2436-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:05 crc kubenswrapper[4831]: I1204 10:33:05.366911 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" Dec 04 10:33:05 crc kubenswrapper[4831]: I1204 10:33:05.367761 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"0123b3bc298be4c2ac62175c379dc6efb186183e599f1998133a95b106c98408"} Dec 04 10:33:05 crc kubenswrapper[4831]: I1204 10:33:05.403572 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c8959dd5-nbrqs"] Dec 04 10:33:05 crc kubenswrapper[4831]: I1204 10:33:05.411548 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c8959dd5-nbrqs"] Dec 04 10:33:07 crc kubenswrapper[4831]: I1204 10:33:07.296295 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bffa83b9-4c9f-4223-9390-d8dacbdf2436" path="/var/lib/kubelet/pods/bffa83b9-4c9f-4223-9390-d8dacbdf2436/volumes" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.090767 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c8959dd5-nbrqs" podUID="bffa83b9-4c9f-4223-9390-d8dacbdf2436" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: i/o timeout" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.206817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " 
pod="openstack/swift-storage-0" Dec 04 10:33:08 crc kubenswrapper[4831]: E1204 10:33:08.207001 4831 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:33:08 crc kubenswrapper[4831]: E1204 10:33:08.207022 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:33:08 crc kubenswrapper[4831]: E1204 10:33:08.207072 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift podName:c8b58305-3383-4cf9-9127-481c1bf16ba5 nodeName:}" failed. No retries permitted until 2025-12-04 10:33:24.207057765 +0000 UTC m=+1101.156233079 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift") pod "swift-storage-0" (UID: "c8b58305-3383-4cf9-9127-481c1bf16ba5") : configmap "swift-ring-files" not found Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.561327 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-68cqf"] Dec 04 10:33:08 crc kubenswrapper[4831]: E1204 10:33:08.562110 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffa83b9-4c9f-4223-9390-d8dacbdf2436" containerName="init" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.562131 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffa83b9-4c9f-4223-9390-d8dacbdf2436" containerName="init" Dec 04 10:33:08 crc kubenswrapper[4831]: E1204 10:33:08.562167 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffa83b9-4c9f-4223-9390-d8dacbdf2436" containerName="dnsmasq-dns" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.562177 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffa83b9-4c9f-4223-9390-d8dacbdf2436" containerName="dnsmasq-dns" Dec 04 10:33:08 crc 
kubenswrapper[4831]: I1204 10:33:08.566882 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bffa83b9-4c9f-4223-9390-d8dacbdf2436" containerName="dnsmasq-dns" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.567605 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-68cqf" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.591573 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-68cqf"] Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.616908 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f47rk\" (UniqueName: \"kubernetes.io/projected/a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9-kube-api-access-f47rk\") pod \"keystone-db-create-68cqf\" (UID: \"a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9\") " pod="openstack/keystone-db-create-68cqf" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.718468 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f47rk\" (UniqueName: \"kubernetes.io/projected/a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9-kube-api-access-f47rk\") pod \"keystone-db-create-68cqf\" (UID: \"a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9\") " pod="openstack/keystone-db-create-68cqf" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.727178 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.738728 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f47rk\" (UniqueName: \"kubernetes.io/projected/a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9-kube-api-access-f47rk\") pod \"keystone-db-create-68cqf\" (UID: \"a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9\") " pod="openstack/keystone-db-create-68cqf" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.753311 4831 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/placement-db-create-hwz95"] Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.754441 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hwz95" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.762072 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hwz95"] Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.819575 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6kcd\" (UniqueName: \"kubernetes.io/projected/f8149a56-f5ed-43f5-98a4-8d7324feadce-kube-api-access-q6kcd\") pod \"placement-db-create-hwz95\" (UID: \"f8149a56-f5ed-43f5-98a4-8d7324feadce\") " pod="openstack/placement-db-create-hwz95" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.921417 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6kcd\" (UniqueName: \"kubernetes.io/projected/f8149a56-f5ed-43f5-98a4-8d7324feadce-kube-api-access-q6kcd\") pod \"placement-db-create-hwz95\" (UID: \"f8149a56-f5ed-43f5-98a4-8d7324feadce\") " pod="openstack/placement-db-create-hwz95" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.924015 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-68cqf" Dec 04 10:33:08 crc kubenswrapper[4831]: I1204 10:33:08.950762 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6kcd\" (UniqueName: \"kubernetes.io/projected/f8149a56-f5ed-43f5-98a4-8d7324feadce-kube-api-access-q6kcd\") pod \"placement-db-create-hwz95\" (UID: \"f8149a56-f5ed-43f5-98a4-8d7324feadce\") " pod="openstack/placement-db-create-hwz95" Dec 04 10:33:09 crc kubenswrapper[4831]: I1204 10:33:09.073236 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hwz95" Dec 04 10:33:09 crc kubenswrapper[4831]: I1204 10:33:09.385800 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-68cqf"] Dec 04 10:33:09 crc kubenswrapper[4831]: I1204 10:33:09.403880 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798f579549-frr45" event={"ID":"83014775-6674-44f9-86ed-952fec02b15b","Type":"ContainerStarted","Data":"84e34fae13b6b19084707fd60a8b792c50ecd8ea5894777117ce1d55df471f27"} Dec 04 10:33:09 crc kubenswrapper[4831]: I1204 10:33:09.405056 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:33:09 crc kubenswrapper[4831]: I1204 10:33:09.406401 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xh7nf" event={"ID":"6f9eb652-90e2-4231-a441-a9947e9fc782","Type":"ContainerStarted","Data":"f27145619c1f4f73d8d7584d371c959b21ed34f09a902480829c8583442cb105"} Dec 04 10:33:09 crc kubenswrapper[4831]: I1204 10:33:09.408284 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-68cqf" event={"ID":"a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9","Type":"ContainerStarted","Data":"803572cb865f08df148a5cc143a9ec03a4901b51464c47993501057c57c62eb2"} Dec 04 10:33:09 crc kubenswrapper[4831]: I1204 10:33:09.429663 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-798f579549-frr45" podStartSLOduration=18.429645765 podStartE2EDuration="18.429645765s" podCreationTimestamp="2025-12-04 10:32:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:33:09.424016116 +0000 UTC m=+1086.373191440" watchObservedRunningTime="2025-12-04 10:33:09.429645765 +0000 UTC m=+1086.378821079" Dec 04 10:33:09 crc kubenswrapper[4831]: I1204 10:33:09.444492 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-xh7nf" podStartSLOduration=2.7397352440000002 podStartE2EDuration="17.444471435s" podCreationTimestamp="2025-12-04 10:32:52 +0000 UTC" firstStartedPulling="2025-12-04 10:32:53.914984736 +0000 UTC m=+1070.864160050" lastFinishedPulling="2025-12-04 10:33:08.619720927 +0000 UTC m=+1085.568896241" observedRunningTime="2025-12-04 10:33:09.440052449 +0000 UTC m=+1086.389227753" watchObservedRunningTime="2025-12-04 10:33:09.444471435 +0000 UTC m=+1086.393646750" Dec 04 10:33:09 crc kubenswrapper[4831]: W1204 10:33:09.527548 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8149a56_f5ed_43f5_98a4_8d7324feadce.slice/crio-12071bcce28b9661dbfc6dde3bd604dca69ed1696a65544bd30d475425e7c2af WatchSource:0}: Error finding container 12071bcce28b9661dbfc6dde3bd604dca69ed1696a65544bd30d475425e7c2af: Status 404 returned error can't find the container with id 12071bcce28b9661dbfc6dde3bd604dca69ed1696a65544bd30d475425e7c2af Dec 04 10:33:09 crc kubenswrapper[4831]: I1204 10:33:09.531866 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hwz95"] Dec 04 10:33:10 crc kubenswrapper[4831]: I1204 10:33:10.419862 4831 generic.go:334] "Generic (PLEG): container finished" podID="a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9" containerID="3f11b7e9c243b75f8a0ef0d67eb7921601bbe2240fbbee0ed72bda35da870121" exitCode=0 Dec 04 10:33:10 crc kubenswrapper[4831]: I1204 10:33:10.420168 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-68cqf" event={"ID":"a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9","Type":"ContainerDied","Data":"3f11b7e9c243b75f8a0ef0d67eb7921601bbe2240fbbee0ed72bda35da870121"} Dec 04 10:33:10 crc kubenswrapper[4831]: I1204 10:33:10.424616 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hwz95" 
event={"ID":"f8149a56-f5ed-43f5-98a4-8d7324feadce","Type":"ContainerStarted","Data":"55abb855d2c74bef7973a0805864c932ae617a0fee27b66627a41a0e4900ac2b"} Dec 04 10:33:10 crc kubenswrapper[4831]: I1204 10:33:10.424733 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hwz95" event={"ID":"f8149a56-f5ed-43f5-98a4-8d7324feadce","Type":"ContainerStarted","Data":"12071bcce28b9661dbfc6dde3bd604dca69ed1696a65544bd30d475425e7c2af"} Dec 04 10:33:10 crc kubenswrapper[4831]: I1204 10:33:10.958837 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-45skt"] Dec 04 10:33:10 crc kubenswrapper[4831]: I1204 10:33:10.960736 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-45skt" Dec 04 10:33:10 crc kubenswrapper[4831]: I1204 10:33:10.962797 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-45skt"] Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.063093 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c6dl\" (UniqueName: \"kubernetes.io/projected/b95fde6f-f78a-4a46-ab00-5f817de61b4e-kube-api-access-9c6dl\") pod \"watcher-db-create-45skt\" (UID: \"b95fde6f-f78a-4a46-ab00-5f817de61b4e\") " pod="openstack/watcher-db-create-45skt" Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.165503 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c6dl\" (UniqueName: \"kubernetes.io/projected/b95fde6f-f78a-4a46-ab00-5f817de61b4e-kube-api-access-9c6dl\") pod \"watcher-db-create-45skt\" (UID: \"b95fde6f-f78a-4a46-ab00-5f817de61b4e\") " pod="openstack/watcher-db-create-45skt" Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.195929 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c6dl\" (UniqueName: 
\"kubernetes.io/projected/b95fde6f-f78a-4a46-ab00-5f817de61b4e-kube-api-access-9c6dl\") pod \"watcher-db-create-45skt\" (UID: \"b95fde6f-f78a-4a46-ab00-5f817de61b4e\") " pod="openstack/watcher-db-create-45skt" Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.290909 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-45skt" Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.437310 4831 generic.go:334] "Generic (PLEG): container finished" podID="f8149a56-f5ed-43f5-98a4-8d7324feadce" containerID="55abb855d2c74bef7973a0805864c932ae617a0fee27b66627a41a0e4900ac2b" exitCode=0 Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.437373 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hwz95" event={"ID":"f8149a56-f5ed-43f5-98a4-8d7324feadce","Type":"ContainerDied","Data":"55abb855d2c74bef7973a0805864c932ae617a0fee27b66627a41a0e4900ac2b"} Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.442327 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e058318a-8379-4f10-9860-7af36b3278e5","Type":"ContainerStarted","Data":"e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c"} Dec 04 10:33:11 crc kubenswrapper[4831]: W1204 10:33:11.793451 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb95fde6f_f78a_4a46_ab00_5f817de61b4e.slice/crio-9e8daa69317c2a688171ece0cfef489c4b166a4f0915a6e1d57a2d60006cf09f WatchSource:0}: Error finding container 9e8daa69317c2a688171ece0cfef489c4b166a4f0915a6e1d57a2d60006cf09f: Status 404 returned error can't find the container with id 9e8daa69317c2a688171ece0cfef489c4b166a4f0915a6e1d57a2d60006cf09f Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.800263 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-45skt"] Dec 04 10:33:11 crc 
kubenswrapper[4831]: I1204 10:33:11.859403 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hwz95" Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.884112 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6kcd\" (UniqueName: \"kubernetes.io/projected/f8149a56-f5ed-43f5-98a4-8d7324feadce-kube-api-access-q6kcd\") pod \"f8149a56-f5ed-43f5-98a4-8d7324feadce\" (UID: \"f8149a56-f5ed-43f5-98a4-8d7324feadce\") " Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.891165 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8149a56-f5ed-43f5-98a4-8d7324feadce-kube-api-access-q6kcd" (OuterVolumeSpecName: "kube-api-access-q6kcd") pod "f8149a56-f5ed-43f5-98a4-8d7324feadce" (UID: "f8149a56-f5ed-43f5-98a4-8d7324feadce"). InnerVolumeSpecName "kube-api-access-q6kcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.894473 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-68cqf" Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.986026 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f47rk\" (UniqueName: \"kubernetes.io/projected/a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9-kube-api-access-f47rk\") pod \"a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9\" (UID: \"a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9\") " Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.986609 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6kcd\" (UniqueName: \"kubernetes.io/projected/f8149a56-f5ed-43f5-98a4-8d7324feadce-kube-api-access-q6kcd\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:11 crc kubenswrapper[4831]: I1204 10:33:11.988812 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9-kube-api-access-f47rk" (OuterVolumeSpecName: "kube-api-access-f47rk") pod "a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9" (UID: "a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9"). InnerVolumeSpecName "kube-api-access-f47rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:12 crc kubenswrapper[4831]: I1204 10:33:12.088418 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f47rk\" (UniqueName: \"kubernetes.io/projected/a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9-kube-api-access-f47rk\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:12 crc kubenswrapper[4831]: I1204 10:33:12.452198 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-68cqf" event={"ID":"a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9","Type":"ContainerDied","Data":"803572cb865f08df148a5cc143a9ec03a4901b51464c47993501057c57c62eb2"} Dec 04 10:33:12 crc kubenswrapper[4831]: I1204 10:33:12.453495 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="803572cb865f08df148a5cc143a9ec03a4901b51464c47993501057c57c62eb2" Dec 04 10:33:12 crc kubenswrapper[4831]: I1204 10:33:12.453699 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-68cqf" Dec 04 10:33:12 crc kubenswrapper[4831]: I1204 10:33:12.459212 4831 generic.go:334] "Generic (PLEG): container finished" podID="b95fde6f-f78a-4a46-ab00-5f817de61b4e" containerID="0b0d322bf8c7d8d87f97d5c85465dee8ff31c878c6974ec296a81d9912a4d80f" exitCode=0 Dec 04 10:33:12 crc kubenswrapper[4831]: I1204 10:33:12.459276 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-45skt" event={"ID":"b95fde6f-f78a-4a46-ab00-5f817de61b4e","Type":"ContainerDied","Data":"0b0d322bf8c7d8d87f97d5c85465dee8ff31c878c6974ec296a81d9912a4d80f"} Dec 04 10:33:12 crc kubenswrapper[4831]: I1204 10:33:12.459312 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-45skt" event={"ID":"b95fde6f-f78a-4a46-ab00-5f817de61b4e","Type":"ContainerStarted","Data":"9e8daa69317c2a688171ece0cfef489c4b166a4f0915a6e1d57a2d60006cf09f"} Dec 04 10:33:12 crc kubenswrapper[4831]: I1204 10:33:12.460587 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hwz95" event={"ID":"f8149a56-f5ed-43f5-98a4-8d7324feadce","Type":"ContainerDied","Data":"12071bcce28b9661dbfc6dde3bd604dca69ed1696a65544bd30d475425e7c2af"} Dec 04 10:33:12 crc kubenswrapper[4831]: I1204 10:33:12.460608 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12071bcce28b9661dbfc6dde3bd604dca69ed1696a65544bd30d475425e7c2af" Dec 04 10:33:12 crc kubenswrapper[4831]: I1204 10:33:12.460641 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hwz95" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.106878 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-2l4h5" podUID="3bc807f5-5d11-4f15-aff6-7c5377f10b33" containerName="ovn-controller" probeResult="failure" output=< Dec 04 10:33:14 crc kubenswrapper[4831]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 10:33:14 crc kubenswrapper[4831]: > Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.503394 4831 generic.go:334] "Generic (PLEG): container finished" podID="2d1e04df-4c2a-440f-b533-9903a58c8ecc" containerID="c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a" exitCode=0 Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.503496 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d1e04df-4c2a-440f-b533-9903a58c8ecc","Type":"ContainerDied","Data":"c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a"} Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.505280 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p9dj4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.506635 4831 generic.go:334] "Generic (PLEG): container finished" podID="462ff702-35a4-4cbe-8155-3ce8a321bf48" 
containerID="92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96" exitCode=0 Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.506693 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"462ff702-35a4-4cbe-8155-3ce8a321bf48","Type":"ContainerDied","Data":"92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96"} Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.507038 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p9dj4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.762795 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2l4h5-config-42jp4"] Dec 04 10:33:14 crc kubenswrapper[4831]: E1204 10:33:14.763241 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9" containerName="mariadb-database-create" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.763263 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9" containerName="mariadb-database-create" Dec 04 10:33:14 crc kubenswrapper[4831]: E1204 10:33:14.763304 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8149a56-f5ed-43f5-98a4-8d7324feadce" containerName="mariadb-database-create" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.763313 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8149a56-f5ed-43f5-98a4-8d7324feadce" containerName="mariadb-database-create" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.763531 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8149a56-f5ed-43f5-98a4-8d7324feadce" containerName="mariadb-database-create" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.763563 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9" containerName="mariadb-database-create" Dec 04 10:33:14 
crc kubenswrapper[4831]: I1204 10:33:14.764219 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.767429 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.778391 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2l4h5-config-42jp4"] Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.853624 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-scripts\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.853765 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run-ovn\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.853825 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-log-ovn\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.853965 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-additional-scripts\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.853988 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjncz\" (UniqueName: \"kubernetes.io/projected/c3fabd1d-3171-451c-b105-8f33112581c5-kube-api-access-vjncz\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.854206 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.955889 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-scripts\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.955953 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run-ovn\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.955984 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-log-ovn\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.956040 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-additional-scripts\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.956058 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjncz\" (UniqueName: \"kubernetes.io/projected/c3fabd1d-3171-451c-b105-8f33112581c5-kube-api-access-vjncz\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.956132 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.956261 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run-ovn\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.956272 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.956325 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-log-ovn\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.956750 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-additional-scripts\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.957955 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-scripts\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:14 crc kubenswrapper[4831]: I1204 10:33:14.986444 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjncz\" (UniqueName: \"kubernetes.io/projected/c3fabd1d-3171-451c-b105-8f33112581c5-kube-api-access-vjncz\") pod \"ovn-controller-2l4h5-config-42jp4\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.081992 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.109847 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-45skt" Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.158640 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c6dl\" (UniqueName: \"kubernetes.io/projected/b95fde6f-f78a-4a46-ab00-5f817de61b4e-kube-api-access-9c6dl\") pod \"b95fde6f-f78a-4a46-ab00-5f817de61b4e\" (UID: \"b95fde6f-f78a-4a46-ab00-5f817de61b4e\") " Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.164844 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b95fde6f-f78a-4a46-ab00-5f817de61b4e-kube-api-access-9c6dl" (OuterVolumeSpecName: "kube-api-access-9c6dl") pod "b95fde6f-f78a-4a46-ab00-5f817de61b4e" (UID: "b95fde6f-f78a-4a46-ab00-5f817de61b4e"). InnerVolumeSpecName "kube-api-access-9c6dl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.262369 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c6dl\" (UniqueName: \"kubernetes.io/projected/b95fde6f-f78a-4a46-ab00-5f817de61b4e-kube-api-access-9c6dl\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:15 crc kubenswrapper[4831]: E1204 10:33:15.326620 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="e058318a-8379-4f10-9860-7af36b3278e5" Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.516535 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d1e04df-4c2a-440f-b533-9903a58c8ecc","Type":"ContainerStarted","Data":"13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77"} Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.516722 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.519018 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"462ff702-35a4-4cbe-8155-3ce8a321bf48","Type":"ContainerStarted","Data":"06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1"} Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.519263 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.520691 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-45skt" Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.520699 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-45skt" event={"ID":"b95fde6f-f78a-4a46-ab00-5f817de61b4e","Type":"ContainerDied","Data":"9e8daa69317c2a688171ece0cfef489c4b166a4f0915a6e1d57a2d60006cf09f"} Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.521312 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e8daa69317c2a688171ece0cfef489c4b166a4f0915a6e1d57a2d60006cf09f" Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.522737 4831 generic.go:334] "Generic (PLEG): container finished" podID="d13ed0c0-494b-46b5-965d-1426a9575119" containerID="ce1e83f2670b8a1f56e0cacd4ecd566e73227dc859e046189c7f385da260a6de" exitCode=0 Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.522796 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d13ed0c0-494b-46b5-965d-1426a9575119","Type":"ContainerDied","Data":"ce1e83f2670b8a1f56e0cacd4ecd566e73227dc859e046189c7f385da260a6de"} Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.525416 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e058318a-8379-4f10-9860-7af36b3278e5","Type":"ContainerStarted","Data":"d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07"} Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.551209 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.885174928 podStartE2EDuration="1m2.551192178s" podCreationTimestamp="2025-12-04 10:32:13 +0000 UTC" firstStartedPulling="2025-12-04 10:32:32.460847537 +0000 UTC m=+1049.410022861" lastFinishedPulling="2025-12-04 10:32:39.126864797 +0000 UTC m=+1056.076040111" observedRunningTime="2025-12-04 10:33:15.546651598 +0000 UTC 
m=+1092.495826932" watchObservedRunningTime="2025-12-04 10:33:15.551192178 +0000 UTC m=+1092.500367492" Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.680729 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.34150359 podStartE2EDuration="1m2.680707503s" podCreationTimestamp="2025-12-04 10:32:13 +0000 UTC" firstStartedPulling="2025-12-04 10:32:32.898817456 +0000 UTC m=+1049.847992760" lastFinishedPulling="2025-12-04 10:32:39.238021359 +0000 UTC m=+1056.187196673" observedRunningTime="2025-12-04 10:33:15.640943504 +0000 UTC m=+1092.590118838" watchObservedRunningTime="2025-12-04 10:33:15.680707503 +0000 UTC m=+1092.629882847" Dec 04 10:33:15 crc kubenswrapper[4831]: I1204 10:33:15.689071 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2l4h5-config-42jp4"] Dec 04 10:33:15 crc kubenswrapper[4831]: W1204 10:33:15.691363 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3fabd1d_3171_451c_b105_8f33112581c5.slice/crio-97806ac97d18c7578df467995110131f5695e0859855f33bc9aac0f8b21ac76e WatchSource:0}: Error finding container 97806ac97d18c7578df467995110131f5695e0859855f33bc9aac0f8b21ac76e: Status 404 returned error can't find the container with id 97806ac97d18c7578df467995110131f5695e0859855f33bc9aac0f8b21ac76e Dec 04 10:33:16 crc kubenswrapper[4831]: I1204 10:33:16.539450 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d13ed0c0-494b-46b5-965d-1426a9575119","Type":"ContainerStarted","Data":"8496ca8751f1b0a974f5a13d06e5c407f3c3f325c650c5ef3cc20fef855359a2"} Dec 04 10:33:16 crc kubenswrapper[4831]: I1204 10:33:16.539978 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Dec 04 10:33:16 crc kubenswrapper[4831]: I1204 10:33:16.541591 4831 
generic.go:334] "Generic (PLEG): container finished" podID="c3fabd1d-3171-451c-b105-8f33112581c5" containerID="1e9cdf3fb9bb3ae6888f5cbab78cc708c0cf20a3e5f2b25201f398548b6207aa" exitCode=0 Dec 04 10:33:16 crc kubenswrapper[4831]: I1204 10:33:16.541716 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2l4h5-config-42jp4" event={"ID":"c3fabd1d-3171-451c-b105-8f33112581c5","Type":"ContainerDied","Data":"1e9cdf3fb9bb3ae6888f5cbab78cc708c0cf20a3e5f2b25201f398548b6207aa"} Dec 04 10:33:16 crc kubenswrapper[4831]: I1204 10:33:16.541772 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2l4h5-config-42jp4" event={"ID":"c3fabd1d-3171-451c-b105-8f33112581c5","Type":"ContainerStarted","Data":"97806ac97d18c7578df467995110131f5695e0859855f33bc9aac0f8b21ac76e"} Dec 04 10:33:16 crc kubenswrapper[4831]: I1204 10:33:16.549811 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:33:16 crc kubenswrapper[4831]: I1204 10:33:16.566444 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=55.510971039 podStartE2EDuration="1m2.566423259s" podCreationTimestamp="2025-12-04 10:32:14 +0000 UTC" firstStartedPulling="2025-12-04 10:32:32.433618529 +0000 UTC m=+1049.382793843" lastFinishedPulling="2025-12-04 10:32:39.489070709 +0000 UTC m=+1056.438246063" observedRunningTime="2025-12-04 10:33:16.563891283 +0000 UTC m=+1093.513066607" watchObservedRunningTime="2025-12-04 10:33:16.566423259 +0000 UTC m=+1093.515598613" Dec 04 10:33:16 crc kubenswrapper[4831]: I1204 10:33:16.638616 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8f654669-pc8vw"] Dec 04 10:33:16 crc kubenswrapper[4831]: I1204 10:33:16.638930 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" 
podUID="50ea30fe-69af-42c7-baf7-3a3bb0ab7882" containerName="dnsmasq-dns" containerID="cri-o://2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa" gracePeriod=10 Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.074533 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.204643 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-config\") pod \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.204730 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-dns-svc\") pod \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.204794 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xndbr\" (UniqueName: \"kubernetes.io/projected/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-kube-api-access-xndbr\") pod \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.204822 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-ovsdbserver-sb\") pod \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\" (UID: \"50ea30fe-69af-42c7-baf7-3a3bb0ab7882\") " Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.214892 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-kube-api-access-xndbr" 
(OuterVolumeSpecName: "kube-api-access-xndbr") pod "50ea30fe-69af-42c7-baf7-3a3bb0ab7882" (UID: "50ea30fe-69af-42c7-baf7-3a3bb0ab7882"). InnerVolumeSpecName "kube-api-access-xndbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.249526 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50ea30fe-69af-42c7-baf7-3a3bb0ab7882" (UID: "50ea30fe-69af-42c7-baf7-3a3bb0ab7882"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.262683 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50ea30fe-69af-42c7-baf7-3a3bb0ab7882" (UID: "50ea30fe-69af-42c7-baf7-3a3bb0ab7882"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.271847 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-config" (OuterVolumeSpecName: "config") pod "50ea30fe-69af-42c7-baf7-3a3bb0ab7882" (UID: "50ea30fe-69af-42c7-baf7-3a3bb0ab7882"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.306920 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.307078 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.307163 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xndbr\" (UniqueName: \"kubernetes.io/projected/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-kube-api-access-xndbr\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.307248 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50ea30fe-69af-42c7-baf7-3a3bb0ab7882-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.550430 4831 generic.go:334] "Generic (PLEG): container finished" podID="50ea30fe-69af-42c7-baf7-3a3bb0ab7882" containerID="2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa" exitCode=0 Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.550653 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.550689 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" event={"ID":"50ea30fe-69af-42c7-baf7-3a3bb0ab7882","Type":"ContainerDied","Data":"2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa"} Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.551442 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8f654669-pc8vw" event={"ID":"50ea30fe-69af-42c7-baf7-3a3bb0ab7882","Type":"ContainerDied","Data":"f1bf1600d403e8f94bf1136132bda283d6c9dcba948f3317e6d91777b75b2c58"} Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.551459 4831 scope.go:117] "RemoveContainer" containerID="2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.555190 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e058318a-8379-4f10-9860-7af36b3278e5","Type":"ContainerStarted","Data":"5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc"} Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.601190 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8f654669-pc8vw"] Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.604826 4831 scope.go:117] "RemoveContainer" containerID="e7333a20bd1aa184b46175df8c8ee084e6bad29cb8fb6a720bb07c9342782426" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.617633 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c8f654669-pc8vw"] Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.620723 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=13.623904816 podStartE2EDuration="57.62070412s" podCreationTimestamp="2025-12-04 10:32:20 +0000 UTC" 
firstStartedPulling="2025-12-04 10:32:33.13028033 +0000 UTC m=+1050.079455644" lastFinishedPulling="2025-12-04 10:33:17.127079634 +0000 UTC m=+1094.076254948" observedRunningTime="2025-12-04 10:33:17.615637826 +0000 UTC m=+1094.564813140" watchObservedRunningTime="2025-12-04 10:33:17.62070412 +0000 UTC m=+1094.569879434" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.636585 4831 scope.go:117] "RemoveContainer" containerID="2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa" Dec 04 10:33:17 crc kubenswrapper[4831]: E1204 10:33:17.637364 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa\": container with ID starting with 2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa not found: ID does not exist" containerID="2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.637402 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa"} err="failed to get container status \"2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa\": rpc error: code = NotFound desc = could not find container \"2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa\": container with ID starting with 2de252bfc4262bbbdc329f8131e651dd08c0e9d8efbfee87b1fdc426745cb9fa not found: ID does not exist" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.637429 4831 scope.go:117] "RemoveContainer" containerID="e7333a20bd1aa184b46175df8c8ee084e6bad29cb8fb6a720bb07c9342782426" Dec 04 10:33:17 crc kubenswrapper[4831]: E1204 10:33:17.638966 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7333a20bd1aa184b46175df8c8ee084e6bad29cb8fb6a720bb07c9342782426\": 
container with ID starting with e7333a20bd1aa184b46175df8c8ee084e6bad29cb8fb6a720bb07c9342782426 not found: ID does not exist" containerID="e7333a20bd1aa184b46175df8c8ee084e6bad29cb8fb6a720bb07c9342782426" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.638996 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7333a20bd1aa184b46175df8c8ee084e6bad29cb8fb6a720bb07c9342782426"} err="failed to get container status \"e7333a20bd1aa184b46175df8c8ee084e6bad29cb8fb6a720bb07c9342782426\": rpc error: code = NotFound desc = could not find container \"e7333a20bd1aa184b46175df8c8ee084e6bad29cb8fb6a720bb07c9342782426\": container with ID starting with e7333a20bd1aa184b46175df8c8ee084e6bad29cb8fb6a720bb07c9342782426 not found: ID does not exist" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.848828 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.920245 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-additional-scripts\") pod \"c3fabd1d-3171-451c-b105-8f33112581c5\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.920426 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-scripts\") pod \"c3fabd1d-3171-451c-b105-8f33112581c5\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.920472 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-log-ovn\") pod \"c3fabd1d-3171-451c-b105-8f33112581c5\" (UID: 
\"c3fabd1d-3171-451c-b105-8f33112581c5\") " Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.920508 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run\") pod \"c3fabd1d-3171-451c-b105-8f33112581c5\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.920529 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run-ovn\") pod \"c3fabd1d-3171-451c-b105-8f33112581c5\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.920601 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjncz\" (UniqueName: \"kubernetes.io/projected/c3fabd1d-3171-451c-b105-8f33112581c5-kube-api-access-vjncz\") pod \"c3fabd1d-3171-451c-b105-8f33112581c5\" (UID: \"c3fabd1d-3171-451c-b105-8f33112581c5\") " Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.921858 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c3fabd1d-3171-451c-b105-8f33112581c5" (UID: "c3fabd1d-3171-451c-b105-8f33112581c5"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.922515 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c3fabd1d-3171-451c-b105-8f33112581c5" (UID: "c3fabd1d-3171-451c-b105-8f33112581c5"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.923251 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-scripts" (OuterVolumeSpecName: "scripts") pod "c3fabd1d-3171-451c-b105-8f33112581c5" (UID: "c3fabd1d-3171-451c-b105-8f33112581c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.923294 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run" (OuterVolumeSpecName: "var-run") pod "c3fabd1d-3171-451c-b105-8f33112581c5" (UID: "c3fabd1d-3171-451c-b105-8f33112581c5"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.923316 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c3fabd1d-3171-451c-b105-8f33112581c5" (UID: "c3fabd1d-3171-451c-b105-8f33112581c5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4831]: I1204 10:33:17.929963 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3fabd1d-3171-451c-b105-8f33112581c5-kube-api-access-vjncz" (OuterVolumeSpecName: "kube-api-access-vjncz") pod "c3fabd1d-3171-451c-b105-8f33112581c5" (UID: "c3fabd1d-3171-451c-b105-8f33112581c5"). InnerVolumeSpecName "kube-api-access-vjncz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.023022 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.023060 4831 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.023070 4831 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.023079 4831 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3fabd1d-3171-451c-b105-8f33112581c5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.023089 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjncz\" (UniqueName: \"kubernetes.io/projected/c3fabd1d-3171-451c-b105-8f33112581c5-kube-api-access-vjncz\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.023099 4831 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3fabd1d-3171-451c-b105-8f33112581c5-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.563434 4831 generic.go:334] "Generic (PLEG): container finished" podID="6f9eb652-90e2-4231-a441-a9947e9fc782" containerID="f27145619c1f4f73d8d7584d371c959b21ed34f09a902480829c8583442cb105" exitCode=0 Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.563514 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-ring-rebalance-xh7nf" event={"ID":"6f9eb652-90e2-4231-a441-a9947e9fc782","Type":"ContainerDied","Data":"f27145619c1f4f73d8d7584d371c959b21ed34f09a902480829c8583442cb105"} Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.567630 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2l4h5-config-42jp4" event={"ID":"c3fabd1d-3171-451c-b105-8f33112581c5","Type":"ContainerDied","Data":"97806ac97d18c7578df467995110131f5695e0859855f33bc9aac0f8b21ac76e"} Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.567680 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97806ac97d18c7578df467995110131f5695e0859855f33bc9aac0f8b21ac76e" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.567693 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2l4h5-config-42jp4" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.570544 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-e63e-account-create-rx9tx"] Dec 04 10:33:18 crc kubenswrapper[4831]: E1204 10:33:18.571009 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b95fde6f-f78a-4a46-ab00-5f817de61b4e" containerName="mariadb-database-create" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.571026 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95fde6f-f78a-4a46-ab00-5f817de61b4e" containerName="mariadb-database-create" Dec 04 10:33:18 crc kubenswrapper[4831]: E1204 10:33:18.571046 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3fabd1d-3171-451c-b105-8f33112581c5" containerName="ovn-config" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.571054 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fabd1d-3171-451c-b105-8f33112581c5" containerName="ovn-config" Dec 04 10:33:18 crc kubenswrapper[4831]: E1204 10:33:18.571074 4831 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="50ea30fe-69af-42c7-baf7-3a3bb0ab7882" containerName="dnsmasq-dns" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.571082 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ea30fe-69af-42c7-baf7-3a3bb0ab7882" containerName="dnsmasq-dns" Dec 04 10:33:18 crc kubenswrapper[4831]: E1204 10:33:18.571097 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ea30fe-69af-42c7-baf7-3a3bb0ab7882" containerName="init" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.571105 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ea30fe-69af-42c7-baf7-3a3bb0ab7882" containerName="init" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.571319 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b95fde6f-f78a-4a46-ab00-5f817de61b4e" containerName="mariadb-database-create" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.571341 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ea30fe-69af-42c7-baf7-3a3bb0ab7882" containerName="dnsmasq-dns" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.571353 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3fabd1d-3171-451c-b105-8f33112581c5" containerName="ovn-config" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.572104 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-e63e-account-create-rx9tx" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.576271 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.601975 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e63e-account-create-rx9tx"] Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.633815 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwt4d\" (UniqueName: \"kubernetes.io/projected/9433b0a1-4aba-4dea-9245-00e6e0ea65b3-kube-api-access-xwt4d\") pod \"keystone-e63e-account-create-rx9tx\" (UID: \"9433b0a1-4aba-4dea-9245-00e6e0ea65b3\") " pod="openstack/keystone-e63e-account-create-rx9tx" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.735123 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwt4d\" (UniqueName: \"kubernetes.io/projected/9433b0a1-4aba-4dea-9245-00e6e0ea65b3-kube-api-access-xwt4d\") pod \"keystone-e63e-account-create-rx9tx\" (UID: \"9433b0a1-4aba-4dea-9245-00e6e0ea65b3\") " pod="openstack/keystone-e63e-account-create-rx9tx" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.757341 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-cd08-account-create-6kj6f"] Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.758573 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-cd08-account-create-6kj6f" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.760517 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.766409 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwt4d\" (UniqueName: \"kubernetes.io/projected/9433b0a1-4aba-4dea-9245-00e6e0ea65b3-kube-api-access-xwt4d\") pod \"keystone-e63e-account-create-rx9tx\" (UID: \"9433b0a1-4aba-4dea-9245-00e6e0ea65b3\") " pod="openstack/keystone-e63e-account-create-rx9tx" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.777820 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cd08-account-create-6kj6f"] Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.836741 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28mbh\" (UniqueName: \"kubernetes.io/projected/d25f503b-ff4c-4e83-9ac4-8342e6b525a5-kube-api-access-28mbh\") pod \"placement-cd08-account-create-6kj6f\" (UID: \"d25f503b-ff4c-4e83-9ac4-8342e6b525a5\") " pod="openstack/placement-cd08-account-create-6kj6f" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.899835 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-e63e-account-create-rx9tx" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.938748 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28mbh\" (UniqueName: \"kubernetes.io/projected/d25f503b-ff4c-4e83-9ac4-8342e6b525a5-kube-api-access-28mbh\") pod \"placement-cd08-account-create-6kj6f\" (UID: \"d25f503b-ff4c-4e83-9ac4-8342e6b525a5\") " pod="openstack/placement-cd08-account-create-6kj6f" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.961326 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2l4h5-config-42jp4"] Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.962273 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28mbh\" (UniqueName: \"kubernetes.io/projected/d25f503b-ff4c-4e83-9ac4-8342e6b525a5-kube-api-access-28mbh\") pod \"placement-cd08-account-create-6kj6f\" (UID: \"d25f503b-ff4c-4e83-9ac4-8342e6b525a5\") " pod="openstack/placement-cd08-account-create-6kj6f" Dec 04 10:33:18 crc kubenswrapper[4831]: I1204 10:33:18.998594 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2l4h5-config-42jp4"] Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.122756 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-cd08-account-create-6kj6f" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.131590 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2l4h5-config-gl774"] Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.132974 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.138287 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-2l4h5" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.138562 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.156079 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2l4h5-config-gl774"] Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.247951 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-log-ovn\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.248007 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.248040 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-additional-scripts\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.248068 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run-ovn\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.248130 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6x86\" (UniqueName: \"kubernetes.io/projected/80a355b8-835d-4581-86c3-6fc035b5c65c-kube-api-access-d6x86\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.248182 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-scripts\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.294625 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ea30fe-69af-42c7-baf7-3a3bb0ab7882" path="/var/lib/kubelet/pods/50ea30fe-69af-42c7-baf7-3a3bb0ab7882/volumes" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.295373 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3fabd1d-3171-451c-b105-8f33112581c5" path="/var/lib/kubelet/pods/c3fabd1d-3171-451c-b105-8f33112581c5/volumes" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.346509 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e63e-account-create-rx9tx"] Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.350502 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6x86\" (UniqueName: 
\"kubernetes.io/projected/80a355b8-835d-4581-86c3-6fc035b5c65c-kube-api-access-d6x86\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.350565 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-scripts\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.350628 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-log-ovn\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.350655 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.350698 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-additional-scripts\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.350726 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run-ovn\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.351033 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run-ovn\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.351082 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.351502 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-log-ovn\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.351705 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-additional-scripts\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.377811 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-scripts\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.415165 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6x86\" (UniqueName: \"kubernetes.io/projected/80a355b8-835d-4581-86c3-6fc035b5c65c-kube-api-access-d6x86\") pod \"ovn-controller-2l4h5-config-gl774\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.551784 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.624321 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e63e-account-create-rx9tx" event={"ID":"9433b0a1-4aba-4dea-9245-00e6e0ea65b3","Type":"ContainerStarted","Data":"ccdaa3af490d2817a4dd9f5eb9be954fd5d1237f294140174213b0fed108470b"} Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.813575 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cd08-account-create-6kj6f"] Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.927869 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.959645 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zccvx\" (UniqueName: \"kubernetes.io/projected/6f9eb652-90e2-4231-a441-a9947e9fc782-kube-api-access-zccvx\") pod \"6f9eb652-90e2-4231-a441-a9947e9fc782\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.959720 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-scripts\") pod \"6f9eb652-90e2-4231-a441-a9947e9fc782\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.959778 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-swiftconf\") pod \"6f9eb652-90e2-4231-a441-a9947e9fc782\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.959885 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-dispersionconf\") pod \"6f9eb652-90e2-4231-a441-a9947e9fc782\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.960001 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-ring-data-devices\") pod \"6f9eb652-90e2-4231-a441-a9947e9fc782\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.960025 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-combined-ca-bundle\") pod \"6f9eb652-90e2-4231-a441-a9947e9fc782\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.960058 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f9eb652-90e2-4231-a441-a9947e9fc782-etc-swift\") pod \"6f9eb652-90e2-4231-a441-a9947e9fc782\" (UID: \"6f9eb652-90e2-4231-a441-a9947e9fc782\") " Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.961978 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6f9eb652-90e2-4231-a441-a9947e9fc782" (UID: "6f9eb652-90e2-4231-a441-a9947e9fc782"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.962063 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9eb652-90e2-4231-a441-a9947e9fc782-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6f9eb652-90e2-4231-a441-a9947e9fc782" (UID: "6f9eb652-90e2-4231-a441-a9947e9fc782"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.974122 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9eb652-90e2-4231-a441-a9947e9fc782-kube-api-access-zccvx" (OuterVolumeSpecName: "kube-api-access-zccvx") pod "6f9eb652-90e2-4231-a441-a9947e9fc782" (UID: "6f9eb652-90e2-4231-a441-a9947e9fc782"). InnerVolumeSpecName "kube-api-access-zccvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.978448 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6f9eb652-90e2-4231-a441-a9947e9fc782" (UID: "6f9eb652-90e2-4231-a441-a9947e9fc782"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:19 crc kubenswrapper[4831]: I1204 10:33:19.999431 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f9eb652-90e2-4231-a441-a9947e9fc782" (UID: "6f9eb652-90e2-4231-a441-a9947e9fc782"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.003528 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6f9eb652-90e2-4231-a441-a9947e9fc782" (UID: "6f9eb652-90e2-4231-a441-a9947e9fc782"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.005704 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-scripts" (OuterVolumeSpecName: "scripts") pod "6f9eb652-90e2-4231-a441-a9947e9fc782" (UID: "6f9eb652-90e2-4231-a441-a9947e9fc782"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.061909 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.062143 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.062153 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f9eb652-90e2-4231-a441-a9947e9fc782-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.062165 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zccvx\" (UniqueName: \"kubernetes.io/projected/6f9eb652-90e2-4231-a441-a9947e9fc782-kube-api-access-zccvx\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.062176 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f9eb652-90e2-4231-a441-a9947e9fc782-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.062184 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.062192 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f9eb652-90e2-4231-a441-a9947e9fc782-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.163379 4831 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2l4h5-config-gl774"] Dec 04 10:33:20 crc kubenswrapper[4831]: W1204 10:33:20.188327 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80a355b8_835d_4581_86c3_6fc035b5c65c.slice/crio-88c11c3a08d610e65e529019408a906b7628ddfe1d23337c6439ec63d0977cfd WatchSource:0}: Error finding container 88c11c3a08d610e65e529019408a906b7628ddfe1d23337c6439ec63d0977cfd: Status 404 returned error can't find the container with id 88c11c3a08d610e65e529019408a906b7628ddfe1d23337c6439ec63d0977cfd Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.634375 4831 generic.go:334] "Generic (PLEG): container finished" podID="9433b0a1-4aba-4dea-9245-00e6e0ea65b3" containerID="707c8a641959817a87955edd4ebfe1341fbcac68ce3e10572059efeb384fd81a" exitCode=0 Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.634493 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e63e-account-create-rx9tx" event={"ID":"9433b0a1-4aba-4dea-9245-00e6e0ea65b3","Type":"ContainerDied","Data":"707c8a641959817a87955edd4ebfe1341fbcac68ce3e10572059efeb384fd81a"} Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.636416 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xh7nf" event={"ID":"6f9eb652-90e2-4231-a441-a9947e9fc782","Type":"ContainerDied","Data":"bbacc049ede87a697a393c4a560ed3bdbc6f7c1f88a6140244c0fa57f36ce832"} Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.636453 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xh7nf" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.636458 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbacc049ede87a697a393c4a560ed3bdbc6f7c1f88a6140244c0fa57f36ce832" Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.642903 4831 generic.go:334] "Generic (PLEG): container finished" podID="d25f503b-ff4c-4e83-9ac4-8342e6b525a5" containerID="1528f69fc1b30778b5461d697506bcae221f968b424cbcea9da7e200d091e0a2" exitCode=0 Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.642984 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cd08-account-create-6kj6f" event={"ID":"d25f503b-ff4c-4e83-9ac4-8342e6b525a5","Type":"ContainerDied","Data":"1528f69fc1b30778b5461d697506bcae221f968b424cbcea9da7e200d091e0a2"} Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.643014 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cd08-account-create-6kj6f" event={"ID":"d25f503b-ff4c-4e83-9ac4-8342e6b525a5","Type":"ContainerStarted","Data":"50f1c90795fb99f51b4b581811a09d43b3ab949a5d330c083161bc242441dc32"} Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.645528 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2l4h5-config-gl774" event={"ID":"80a355b8-835d-4581-86c3-6fc035b5c65c","Type":"ContainerStarted","Data":"be9a409630e74deee12f0e444e4b33754b450525865232bcf4919edab314ae23"} Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.645563 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2l4h5-config-gl774" event={"ID":"80a355b8-835d-4581-86c3-6fc035b5c65c","Type":"ContainerStarted","Data":"88c11c3a08d610e65e529019408a906b7628ddfe1d23337c6439ec63d0977cfd"} Dec 04 10:33:20 crc kubenswrapper[4831]: I1204 10:33:20.693505 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-2l4h5-config-gl774" podStartSLOduration=1.6934852889999998 podStartE2EDuration="1.693485289s" podCreationTimestamp="2025-12-04 10:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:33:20.682879369 +0000 UTC m=+1097.632054683" watchObservedRunningTime="2025-12-04 10:33:20.693485289 +0000 UTC m=+1097.642660623" Dec 04 10:33:21 crc kubenswrapper[4831]: I1204 10:33:21.664080 4831 generic.go:334] "Generic (PLEG): container finished" podID="80a355b8-835d-4581-86c3-6fc035b5c65c" containerID="be9a409630e74deee12f0e444e4b33754b450525865232bcf4919edab314ae23" exitCode=0 Dec 04 10:33:21 crc kubenswrapper[4831]: I1204 10:33:21.664487 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2l4h5-config-gl774" event={"ID":"80a355b8-835d-4581-86c3-6fc035b5c65c","Type":"ContainerDied","Data":"be9a409630e74deee12f0e444e4b33754b450525865232bcf4919edab314ae23"} Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.096558 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e63e-account-create-rx9tx" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.102831 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-cd08-account-create-6kj6f" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.195037 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28mbh\" (UniqueName: \"kubernetes.io/projected/d25f503b-ff4c-4e83-9ac4-8342e6b525a5-kube-api-access-28mbh\") pod \"d25f503b-ff4c-4e83-9ac4-8342e6b525a5\" (UID: \"d25f503b-ff4c-4e83-9ac4-8342e6b525a5\") " Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.195273 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwt4d\" (UniqueName: \"kubernetes.io/projected/9433b0a1-4aba-4dea-9245-00e6e0ea65b3-kube-api-access-xwt4d\") pod \"9433b0a1-4aba-4dea-9245-00e6e0ea65b3\" (UID: \"9433b0a1-4aba-4dea-9245-00e6e0ea65b3\") " Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.201820 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d25f503b-ff4c-4e83-9ac4-8342e6b525a5-kube-api-access-28mbh" (OuterVolumeSpecName: "kube-api-access-28mbh") pod "d25f503b-ff4c-4e83-9ac4-8342e6b525a5" (UID: "d25f503b-ff4c-4e83-9ac4-8342e6b525a5"). InnerVolumeSpecName "kube-api-access-28mbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.201958 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9433b0a1-4aba-4dea-9245-00e6e0ea65b3-kube-api-access-xwt4d" (OuterVolumeSpecName: "kube-api-access-xwt4d") pod "9433b0a1-4aba-4dea-9245-00e6e0ea65b3" (UID: "9433b0a1-4aba-4dea-9245-00e6e0ea65b3"). InnerVolumeSpecName "kube-api-access-xwt4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.297716 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwt4d\" (UniqueName: \"kubernetes.io/projected/9433b0a1-4aba-4dea-9245-00e6e0ea65b3-kube-api-access-xwt4d\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.297752 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28mbh\" (UniqueName: \"kubernetes.io/projected/d25f503b-ff4c-4e83-9ac4-8342e6b525a5-kube-api-access-28mbh\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.339109 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.339160 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.341998 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.672778 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cd08-account-create-6kj6f" event={"ID":"d25f503b-ff4c-4e83-9ac4-8342e6b525a5","Type":"ContainerDied","Data":"50f1c90795fb99f51b4b581811a09d43b3ab949a5d330c083161bc242441dc32"} Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.672833 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50f1c90795fb99f51b4b581811a09d43b3ab949a5d330c083161bc242441dc32" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.672792 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-cd08-account-create-6kj6f" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.674307 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e63e-account-create-rx9tx" event={"ID":"9433b0a1-4aba-4dea-9245-00e6e0ea65b3","Type":"ContainerDied","Data":"ccdaa3af490d2817a4dd9f5eb9be954fd5d1237f294140174213b0fed108470b"} Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.674340 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccdaa3af490d2817a4dd9f5eb9be954fd5d1237f294140174213b0fed108470b" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.674397 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e63e-account-create-rx9tx" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.682024 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:22 crc kubenswrapper[4831]: I1204 10:33:22.962089 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.009581 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-additional-scripts\") pod \"80a355b8-835d-4581-86c3-6fc035b5c65c\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.009639 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run-ovn\") pod \"80a355b8-835d-4581-86c3-6fc035b5c65c\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.009688 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-log-ovn\") pod \"80a355b8-835d-4581-86c3-6fc035b5c65c\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.009736 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-scripts\") pod \"80a355b8-835d-4581-86c3-6fc035b5c65c\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.009788 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6x86\" (UniqueName: \"kubernetes.io/projected/80a355b8-835d-4581-86c3-6fc035b5c65c-kube-api-access-d6x86\") pod \"80a355b8-835d-4581-86c3-6fc035b5c65c\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.009834 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run\") pod \"80a355b8-835d-4581-86c3-6fc035b5c65c\" (UID: \"80a355b8-835d-4581-86c3-6fc035b5c65c\") " Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.010298 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run" (OuterVolumeSpecName: "var-run") pod "80a355b8-835d-4581-86c3-6fc035b5c65c" (UID: "80a355b8-835d-4581-86c3-6fc035b5c65c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.010358 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "80a355b8-835d-4581-86c3-6fc035b5c65c" (UID: "80a355b8-835d-4581-86c3-6fc035b5c65c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.011284 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "80a355b8-835d-4581-86c3-6fc035b5c65c" (UID: "80a355b8-835d-4581-86c3-6fc035b5c65c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.011322 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "80a355b8-835d-4581-86c3-6fc035b5c65c" (UID: "80a355b8-835d-4581-86c3-6fc035b5c65c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.011532 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-scripts" (OuterVolumeSpecName: "scripts") pod "80a355b8-835d-4581-86c3-6fc035b5c65c" (UID: "80a355b8-835d-4581-86c3-6fc035b5c65c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.016804 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a355b8-835d-4581-86c3-6fc035b5c65c-kube-api-access-d6x86" (OuterVolumeSpecName: "kube-api-access-d6x86") pod "80a355b8-835d-4581-86c3-6fc035b5c65c" (UID: "80a355b8-835d-4581-86c3-6fc035b5c65c"). InnerVolumeSpecName "kube-api-access-d6x86". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.111390 4831 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.111714 4831 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.111725 4831 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.111736 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80a355b8-835d-4581-86c3-6fc035b5c65c-scripts\") on node \"crc\" DevicePath \"\"" Dec 
04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.111746 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6x86\" (UniqueName: \"kubernetes.io/projected/80a355b8-835d-4581-86c3-6fc035b5c65c-kube-api-access-d6x86\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.111761 4831 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80a355b8-835d-4581-86c3-6fc035b5c65c-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.228286 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2l4h5-config-gl774"] Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.234642 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2l4h5-config-gl774"] Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.285181 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80a355b8-835d-4581-86c3-6fc035b5c65c" path="/var/lib/kubelet/pods/80a355b8-835d-4581-86c3-6fc035b5c65c/volumes" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.682434 4831 scope.go:117] "RemoveContainer" containerID="be9a409630e74deee12f0e444e4b33754b450525865232bcf4919edab314ae23" Dec 04 10:33:23 crc kubenswrapper[4831]: I1204 10:33:23.682432 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2l4h5-config-gl774" Dec 04 10:33:24 crc kubenswrapper[4831]: I1204 10:33:24.228829 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:33:24 crc kubenswrapper[4831]: I1204 10:33:24.243023 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b58305-3383-4cf9-9127-481c1bf16ba5-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b58305-3383-4cf9-9127-481c1bf16ba5\") " pod="openstack/swift-storage-0" Dec 04 10:33:24 crc kubenswrapper[4831]: I1204 10:33:24.396712 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 04 10:33:24 crc kubenswrapper[4831]: I1204 10:33:24.950214 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 10:33:24 crc kubenswrapper[4831]: W1204 10:33:24.951516 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8b58305_3383_4cf9_9127_481c1bf16ba5.slice/crio-6a50916f0d6f0ff8dca38420595023c181ded6067a9dd5856ce1b5f8c5571d02 WatchSource:0}: Error finding container 6a50916f0d6f0ff8dca38420595023c181ded6067a9dd5856ce1b5f8c5571d02: Status 404 returned error can't find the container with id 6a50916f0d6f0ff8dca38420595023c181ded6067a9dd5856ce1b5f8c5571d02 Dec 04 10:33:25 crc kubenswrapper[4831]: I1204 10:33:25.129027 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2d1e04df-4c2a-440f-b533-9903a58c8ecc" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Dec 04 10:33:25 crc kubenswrapper[4831]: 
I1204 10:33:25.416792 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="d13ed0c0-494b-46b5-965d-1426a9575119" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Dec 04 10:33:25 crc kubenswrapper[4831]: I1204 10:33:25.459193 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 10:33:25 crc kubenswrapper[4831]: I1204 10:33:25.459497 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="config-reloader" containerID="cri-o://e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c" gracePeriod=600 Dec 04 10:33:25 crc kubenswrapper[4831]: I1204 10:33:25.459549 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="prometheus" containerID="cri-o://5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc" gracePeriod=600 Dec 04 10:33:25 crc kubenswrapper[4831]: I1204 10:33:25.459602 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="thanos-sidecar" containerID="cri-o://d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07" gracePeriod=600 Dec 04 10:33:25 crc kubenswrapper[4831]: I1204 10:33:25.706981 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"6a50916f0d6f0ff8dca38420595023c181ded6067a9dd5856ce1b5f8c5571d02"} Dec 04 10:33:25 crc kubenswrapper[4831]: I1204 10:33:25.710222 4831 generic.go:334] "Generic (PLEG): container finished" podID="e058318a-8379-4f10-9860-7af36b3278e5" 
containerID="5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc" exitCode=0 Dec 04 10:33:25 crc kubenswrapper[4831]: I1204 10:33:25.710246 4831 generic.go:334] "Generic (PLEG): container finished" podID="e058318a-8379-4f10-9860-7af36b3278e5" containerID="d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07" exitCode=0 Dec 04 10:33:25 crc kubenswrapper[4831]: I1204 10:33:25.710259 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e058318a-8379-4f10-9860-7af36b3278e5","Type":"ContainerDied","Data":"5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc"} Dec 04 10:33:25 crc kubenswrapper[4831]: I1204 10:33:25.710301 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e058318a-8379-4f10-9860-7af36b3278e5","Type":"ContainerDied","Data":"d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07"} Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.451594 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.573868 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-web-config\") pod \"e058318a-8379-4f10-9860-7af36b3278e5\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.573950 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e058318a-8379-4f10-9860-7af36b3278e5-prometheus-metric-storage-rulefiles-0\") pod \"e058318a-8379-4f10-9860-7af36b3278e5\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.573980 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-thanos-prometheus-http-client-file\") pod \"e058318a-8379-4f10-9860-7af36b3278e5\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.574000 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-tls-assets\") pod \"e058318a-8379-4f10-9860-7af36b3278e5\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.574020 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fgwv\" (UniqueName: \"kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-kube-api-access-7fgwv\") pod \"e058318a-8379-4f10-9860-7af36b3278e5\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.574038 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e058318a-8379-4f10-9860-7af36b3278e5-config-out\") pod \"e058318a-8379-4f10-9860-7af36b3278e5\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.574200 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"e058318a-8379-4f10-9860-7af36b3278e5\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.574254 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-config\") pod \"e058318a-8379-4f10-9860-7af36b3278e5\" (UID: \"e058318a-8379-4f10-9860-7af36b3278e5\") " Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.575637 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e058318a-8379-4f10-9860-7af36b3278e5-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e058318a-8379-4f10-9860-7af36b3278e5" (UID: "e058318a-8379-4f10-9860-7af36b3278e5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.580123 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-kube-api-access-7fgwv" (OuterVolumeSpecName: "kube-api-access-7fgwv") pod "e058318a-8379-4f10-9860-7af36b3278e5" (UID: "e058318a-8379-4f10-9860-7af36b3278e5"). InnerVolumeSpecName "kube-api-access-7fgwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.580142 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e058318a-8379-4f10-9860-7af36b3278e5-config-out" (OuterVolumeSpecName: "config-out") pod "e058318a-8379-4f10-9860-7af36b3278e5" (UID: "e058318a-8379-4f10-9860-7af36b3278e5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.580712 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e058318a-8379-4f10-9860-7af36b3278e5" (UID: "e058318a-8379-4f10-9860-7af36b3278e5"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.581551 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e058318a-8379-4f10-9860-7af36b3278e5" (UID: "e058318a-8379-4f10-9860-7af36b3278e5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.584874 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-config" (OuterVolumeSpecName: "config") pod "e058318a-8379-4f10-9860-7af36b3278e5" (UID: "e058318a-8379-4f10-9860-7af36b3278e5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.599166 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-web-config" (OuterVolumeSpecName: "web-config") pod "e058318a-8379-4f10-9860-7af36b3278e5" (UID: "e058318a-8379-4f10-9860-7af36b3278e5"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.608549 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e058318a-8379-4f10-9860-7af36b3278e5" (UID: "e058318a-8379-4f10-9860-7af36b3278e5"). InnerVolumeSpecName "pvc-7b44ca90-3490-4c8d-99fe-c1474d342303". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.675890 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") on node \"crc\" " Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.675928 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.675946 4831 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-web-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.675969 4831 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/e058318a-8379-4f10-9860-7af36b3278e5-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.675985 4831 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e058318a-8379-4f10-9860-7af36b3278e5-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.675998 4831 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.676010 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fgwv\" (UniqueName: \"kubernetes.io/projected/e058318a-8379-4f10-9860-7af36b3278e5-kube-api-access-7fgwv\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.676023 4831 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e058318a-8379-4f10-9860-7af36b3278e5-config-out\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.707770 4831 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.707939 4831 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7b44ca90-3490-4c8d-99fe-c1474d342303" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303") on node "crc" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.724673 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"25643e0a5840ac3e730e158f59c84f9bbb9ebc8370364a73cc2d54e8cfcf3fd3"} Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.724722 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"0eadbd39146ea0b1847b79a0b20aa38cb58f291ec86a05df7b978a4a23313d8f"} Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.724732 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"8d3057314b953e4691de6c9b341a63d5d57d32bde7b24ae923796bf079f8a41d"} Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.728335 4831 generic.go:334] "Generic (PLEG): container finished" podID="e058318a-8379-4f10-9860-7af36b3278e5" containerID="e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c" exitCode=0 Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.728421 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e058318a-8379-4f10-9860-7af36b3278e5","Type":"ContainerDied","Data":"e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c"} Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.728451 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"e058318a-8379-4f10-9860-7af36b3278e5","Type":"ContainerDied","Data":"5641e197a46725bffb15cd9332375a4b4af9a436ec9f385e633700f071ba879e"} Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.728468 4831 scope.go:117] "RemoveContainer" containerID="5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.728741 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.757033 4831 scope.go:117] "RemoveContainer" containerID="d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.769529 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.777014 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.777936 4831 reconciler_common.go:293] "Volume detached for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.807794 4831 scope.go:117] "RemoveContainer" containerID="e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.819370 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.819748 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="prometheus" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.819766 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="prometheus" Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.819777 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="init-config-reloader" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.819785 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="init-config-reloader" Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.819798 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a355b8-835d-4581-86c3-6fc035b5c65c" containerName="ovn-config" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.819804 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a355b8-835d-4581-86c3-6fc035b5c65c" containerName="ovn-config" Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.819811 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9433b0a1-4aba-4dea-9245-00e6e0ea65b3" containerName="mariadb-account-create" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.819816 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9433b0a1-4aba-4dea-9245-00e6e0ea65b3" containerName="mariadb-account-create" Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.819823 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d25f503b-ff4c-4e83-9ac4-8342e6b525a5" containerName="mariadb-account-create" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.819829 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d25f503b-ff4c-4e83-9ac4-8342e6b525a5" containerName="mariadb-account-create" Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.819838 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="config-reloader" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.819845 4831 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="config-reloader" Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.819855 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9eb652-90e2-4231-a441-a9947e9fc782" containerName="swift-ring-rebalance" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.819861 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9eb652-90e2-4231-a441-a9947e9fc782" containerName="swift-ring-rebalance" Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.819878 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="thanos-sidecar" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.819884 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="thanos-sidecar" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.820061 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d25f503b-ff4c-4e83-9ac4-8342e6b525a5" containerName="mariadb-account-create" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.820074 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9eb652-90e2-4231-a441-a9947e9fc782" containerName="swift-ring-rebalance" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.820085 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="prometheus" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.820093 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a355b8-835d-4581-86c3-6fc035b5c65c" containerName="ovn-config" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.820102 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9433b0a1-4aba-4dea-9245-00e6e0ea65b3" containerName="mariadb-account-create" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.820112 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="config-reloader" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.820121 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e058318a-8379-4f10-9860-7af36b3278e5" containerName="thanos-sidecar" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.822571 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.826804 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.826915 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-6p9wz" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.826974 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.827149 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.827277 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.827584 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.835141 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.850250 4831 scope.go:117] "RemoveContainer" containerID="d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf" Dec 04 
10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.856555 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.887225 4831 scope.go:117] "RemoveContainer" containerID="5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc" Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.889550 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc\": container with ID starting with 5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc not found: ID does not exist" containerID="5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.889595 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc"} err="failed to get container status \"5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc\": rpc error: code = NotFound desc = could not find container \"5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc\": container with ID starting with 5ab972791e70f3d9847c29e4571f38d67adab4b6eaa9cc79e48340f2769731dc not found: ID does not exist" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.889632 4831 scope.go:117] "RemoveContainer" containerID="d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07" Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.890452 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07\": container with ID starting with d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07 not found: ID does not exist" 
containerID="d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.890475 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07"} err="failed to get container status \"d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07\": rpc error: code = NotFound desc = could not find container \"d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07\": container with ID starting with d5101b883935f875441bbee30620f782e02cc4a55144d0af5cc004f056164c07 not found: ID does not exist" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.890489 4831 scope.go:117] "RemoveContainer" containerID="e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c" Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.891793 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c\": container with ID starting with e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c not found: ID does not exist" containerID="e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.891817 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c"} err="failed to get container status \"e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c\": rpc error: code = NotFound desc = could not find container \"e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c\": container with ID starting with e1e401b2381a51ccd3f08295b648ee827e192e48bf52c3ce23d645caafe2437c not found: ID does not exist" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.891832 4831 scope.go:117] 
"RemoveContainer" containerID="d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf" Dec 04 10:33:26 crc kubenswrapper[4831]: E1204 10:33:26.895572 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf\": container with ID starting with d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf not found: ID does not exist" containerID="d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.895597 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf"} err="failed to get container status \"d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf\": rpc error: code = NotFound desc = could not find container \"d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf\": container with ID starting with d125746330cd0b36bdef72c8c0c34dbc600aa31a341f76e101598642834a0abf not found: ID does not exist" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.981341 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.981393 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " 
pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.981431 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.981462 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll6xr\" (UniqueName: \"kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-kube-api-access-ll6xr\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.981484 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/44484889-57d0-478f-a19a-2e78f913faf3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.981505 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-config\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.981523 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.981542 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/44484889-57d0-478f-a19a-2e78f913faf3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.981571 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.981588 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:26 crc kubenswrapper[4831]: I1204 10:33:26.981651 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " 
pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.083396 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6xr\" (UniqueName: \"kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-kube-api-access-ll6xr\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.083439 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/44484889-57d0-478f-a19a-2e78f913faf3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.083464 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-config\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.083483 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.083508 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/44484889-57d0-478f-a19a-2e78f913faf3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " 
pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.083546 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.083567 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.083585 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.083647 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.083685 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.083716 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.085484 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/44484889-57d0-478f-a19a-2e78f913faf3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.089223 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.089275 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.093081 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-config\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.093358 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/44484889-57d0-478f-a19a-2e78f913faf3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.095416 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.095453 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c51bcc27fcdd05e3b872f332894f06756ce616ce82fb1b91e1091af3050124aa/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.095757 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.096282 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.096629 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.096758 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.117135 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll6xr\" (UniqueName: \"kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-kube-api-access-ll6xr\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.141337 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.151594 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.289673 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e058318a-8379-4f10-9860-7af36b3278e5" path="/var/lib/kubelet/pods/e058318a-8379-4f10-9860-7af36b3278e5/volumes" Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.743638 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"86bb898139a3ade932e4b524d85af43d472cd2f0c8f0a90f3b390bfe7f4f077f"} Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.743978 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"c7dc0a2e85103db59c44e2e94f30a4d08ddb30b852d74038affa8b5f385f569d"} Dec 04 10:33:27 crc kubenswrapper[4831]: I1204 10:33:27.752112 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 10:33:27 crc kubenswrapper[4831]: W1204 10:33:27.757011 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44484889_57d0_478f_a19a_2e78f913faf3.slice/crio-513900fbf669dc79ceb241084fd1b200a072a12239ce24a633bbaed1680cd8ff WatchSource:0}: Error finding container 513900fbf669dc79ceb241084fd1b200a072a12239ce24a633bbaed1680cd8ff: Status 404 returned error can't find the container with id 513900fbf669dc79ceb241084fd1b200a072a12239ce24a633bbaed1680cd8ff Dec 04 10:33:28 crc kubenswrapper[4831]: I1204 10:33:28.757127 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"d1741d9d32ad0b4a4794d306ae60dabb0d24aabd8ff365ae78d3f05338d86c9b"} Dec 04 10:33:28 crc 
kubenswrapper[4831]: I1204 10:33:28.757498 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"7722a9b7ff441c7ffff486e3248bd7282a57e5ddea52ba36a9f3224a408cd6c6"} Dec 04 10:33:28 crc kubenswrapper[4831]: I1204 10:33:28.757514 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"405c9f841d05733feb4bba491af5021aacfc8d4893d3bfc27d298999b918f675"} Dec 04 10:33:28 crc kubenswrapper[4831]: I1204 10:33:28.758551 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44484889-57d0-478f-a19a-2e78f913faf3","Type":"ContainerStarted","Data":"513900fbf669dc79ceb241084fd1b200a072a12239ce24a633bbaed1680cd8ff"} Dec 04 10:33:29 crc kubenswrapper[4831]: I1204 10:33:29.782368 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"b8642771a2277a41e90cc1986fccfea6e3696e586c2553b993abf363d4c1b040"} Dec 04 10:33:29 crc kubenswrapper[4831]: I1204 10:33:29.782898 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"ddce09d009f50a3c34efe7120ef2c7b01b885cc10280b0c526bbf31c9f49b828"} Dec 04 10:33:30 crc kubenswrapper[4831]: I1204 10:33:30.795420 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44484889-57d0-478f-a19a-2e78f913faf3","Type":"ContainerStarted","Data":"29d94e9ef7d95da5ed21b153b30bb46ff0865fe591891f5e9d44bf1656bd0e05"} Dec 04 10:33:30 crc kubenswrapper[4831]: I1204 10:33:30.801135 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"47e0c876e016e578cc3a7b89dd2210b613f8acbbd46660e6c5eef028df5d5941"} Dec 04 10:33:30 crc kubenswrapper[4831]: I1204 10:33:30.801180 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"8285bb5cfd478129d8bbd509ba2b66fd9ad1b5afc66fc28113f3b979b24ab5ae"} Dec 04 10:33:30 crc kubenswrapper[4831]: I1204 10:33:30.801195 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"c84fe987579955865df6893168797c5fa7d4c2201d1001ce9dfed25e84ef1795"} Dec 04 10:33:30 crc kubenswrapper[4831]: I1204 10:33:30.974350 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-4a99-account-create-22f2m"] Dec 04 10:33:30 crc kubenswrapper[4831]: I1204 10:33:30.980732 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-4a99-account-create-22f2m" Dec 04 10:33:30 crc kubenswrapper[4831]: I1204 10:33:30.984440 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Dec 04 10:33:31 crc kubenswrapper[4831]: I1204 10:33:31.000439 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-4a99-account-create-22f2m"] Dec 04 10:33:31 crc kubenswrapper[4831]: I1204 10:33:31.151900 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlmb8\" (UniqueName: \"kubernetes.io/projected/130910ed-23cb-4473-b0cb-e59907852513-kube-api-access-xlmb8\") pod \"watcher-4a99-account-create-22f2m\" (UID: \"130910ed-23cb-4473-b0cb-e59907852513\") " pod="openstack/watcher-4a99-account-create-22f2m" Dec 04 10:33:31 crc kubenswrapper[4831]: I1204 10:33:31.253491 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlmb8\" (UniqueName: \"kubernetes.io/projected/130910ed-23cb-4473-b0cb-e59907852513-kube-api-access-xlmb8\") pod \"watcher-4a99-account-create-22f2m\" (UID: \"130910ed-23cb-4473-b0cb-e59907852513\") " pod="openstack/watcher-4a99-account-create-22f2m" Dec 04 10:33:31 crc kubenswrapper[4831]: I1204 10:33:31.284315 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlmb8\" (UniqueName: \"kubernetes.io/projected/130910ed-23cb-4473-b0cb-e59907852513-kube-api-access-xlmb8\") pod \"watcher-4a99-account-create-22f2m\" (UID: \"130910ed-23cb-4473-b0cb-e59907852513\") " pod="openstack/watcher-4a99-account-create-22f2m" Dec 04 10:33:31 crc kubenswrapper[4831]: I1204 10:33:31.312905 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-4a99-account-create-22f2m" Dec 04 10:33:31 crc kubenswrapper[4831]: I1204 10:33:31.781008 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-4a99-account-create-22f2m"] Dec 04 10:33:31 crc kubenswrapper[4831]: I1204 10:33:31.815325 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"5c1c433b7b980b1d44b8b28ff36e07a5d13f7620fa09e702a4fffff9a07204cb"} Dec 04 10:33:31 crc kubenswrapper[4831]: I1204 10:33:31.816336 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b58305-3383-4cf9-9127-481c1bf16ba5","Type":"ContainerStarted","Data":"aefe4384084a614a093ca12e11812503ba753d9e6ba58674dae9d0b63e0aa8e0"} Dec 04 10:33:31 crc kubenswrapper[4831]: I1204 10:33:31.818144 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-4a99-account-create-22f2m" event={"ID":"130910ed-23cb-4473-b0cb-e59907852513","Type":"ContainerStarted","Data":"0e5d58a0c51af9fbde2571028806e0359b86340cfab7b0b531d968ab7d459ce9"} Dec 04 10:33:31 crc kubenswrapper[4831]: I1204 10:33:31.855704 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.636731834 podStartE2EDuration="40.855678893s" podCreationTimestamp="2025-12-04 10:32:51 +0000 UTC" firstStartedPulling="2025-12-04 10:33:24.953324748 +0000 UTC m=+1101.902500062" lastFinishedPulling="2025-12-04 10:33:29.172271807 +0000 UTC m=+1106.121447121" observedRunningTime="2025-12-04 10:33:31.844434837 +0000 UTC m=+1108.793610151" watchObservedRunningTime="2025-12-04 10:33:31.855678893 +0000 UTC m=+1108.804854207" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.218982 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59db87f787-ql2pd"] Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 
10:33:32.220652 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.226693 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.229853 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59db87f787-ql2pd"] Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.368507 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fg7d\" (UniqueName: \"kubernetes.io/projected/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-kube-api-access-2fg7d\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.368607 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-svc\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.368770 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-nb\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.368824 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-config\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: 
\"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.368874 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-sb\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.369007 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-swift-storage-0\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.470990 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-nb\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.471068 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-config\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.471131 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-sb\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: 
\"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.471250 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-swift-storage-0\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.471335 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fg7d\" (UniqueName: \"kubernetes.io/projected/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-kube-api-access-2fg7d\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.471511 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-svc\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.472080 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-swift-storage-0\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.472079 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-nb\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " 
pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.472157 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-config\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.472220 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-sb\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.473011 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-svc\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.493270 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fg7d\" (UniqueName: \"kubernetes.io/projected/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-kube-api-access-2fg7d\") pod \"dnsmasq-dns-59db87f787-ql2pd\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.536606 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.825971 4831 generic.go:334] "Generic (PLEG): container finished" podID="130910ed-23cb-4473-b0cb-e59907852513" containerID="88e3709a20824fbda63a27d39977898e63725c6f9ce174c6256c7a8bde9ef169" exitCode=0 Dec 04 10:33:32 crc kubenswrapper[4831]: I1204 10:33:32.826074 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-4a99-account-create-22f2m" event={"ID":"130910ed-23cb-4473-b0cb-e59907852513","Type":"ContainerDied","Data":"88e3709a20824fbda63a27d39977898e63725c6f9ce174c6256c7a8bde9ef169"} Dec 04 10:33:33 crc kubenswrapper[4831]: I1204 10:33:33.013020 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59db87f787-ql2pd"] Dec 04 10:33:33 crc kubenswrapper[4831]: W1204 10:33:33.015189 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ee1b2fc_3b04_4f1c_9671_a3cb4945fd61.slice/crio-3b2ab94b0ba2a0fb250c141450a8c84a7e427f1c716a41e4539468d6ebc71464 WatchSource:0}: Error finding container 3b2ab94b0ba2a0fb250c141450a8c84a7e427f1c716a41e4539468d6ebc71464: Status 404 returned error can't find the container with id 3b2ab94b0ba2a0fb250c141450a8c84a7e427f1c716a41e4539468d6ebc71464 Dec 04 10:33:33 crc kubenswrapper[4831]: I1204 10:33:33.839803 4831 generic.go:334] "Generic (PLEG): container finished" podID="2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" containerID="c914bf4be613426ce9e2fd91cb2f06ec674896d4ae1a394ceb55b91ae1dd71f9" exitCode=0 Dec 04 10:33:33 crc kubenswrapper[4831]: I1204 10:33:33.839856 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" event={"ID":"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61","Type":"ContainerDied","Data":"c914bf4be613426ce9e2fd91cb2f06ec674896d4ae1a394ceb55b91ae1dd71f9"} Dec 04 10:33:33 crc kubenswrapper[4831]: I1204 10:33:33.840145 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" event={"ID":"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61","Type":"ContainerStarted","Data":"3b2ab94b0ba2a0fb250c141450a8c84a7e427f1c716a41e4539468d6ebc71464"} Dec 04 10:33:34 crc kubenswrapper[4831]: I1204 10:33:34.203158 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-4a99-account-create-22f2m" Dec 04 10:33:34 crc kubenswrapper[4831]: I1204 10:33:34.299515 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlmb8\" (UniqueName: \"kubernetes.io/projected/130910ed-23cb-4473-b0cb-e59907852513-kube-api-access-xlmb8\") pod \"130910ed-23cb-4473-b0cb-e59907852513\" (UID: \"130910ed-23cb-4473-b0cb-e59907852513\") " Dec 04 10:33:34 crc kubenswrapper[4831]: I1204 10:33:34.309641 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130910ed-23cb-4473-b0cb-e59907852513-kube-api-access-xlmb8" (OuterVolumeSpecName: "kube-api-access-xlmb8") pod "130910ed-23cb-4473-b0cb-e59907852513" (UID: "130910ed-23cb-4473-b0cb-e59907852513"). InnerVolumeSpecName "kube-api-access-xlmb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:34 crc kubenswrapper[4831]: I1204 10:33:34.402132 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlmb8\" (UniqueName: \"kubernetes.io/projected/130910ed-23cb-4473-b0cb-e59907852513-kube-api-access-xlmb8\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:34 crc kubenswrapper[4831]: I1204 10:33:34.805982 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 10:33:34 crc kubenswrapper[4831]: I1204 10:33:34.859044 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" event={"ID":"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61","Type":"ContainerStarted","Data":"82cdfc36281272e7a838a3d13d489979203c8808a5e83c3711a76a6ac90ae985"} Dec 04 10:33:34 crc kubenswrapper[4831]: I1204 10:33:34.860520 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:34 crc kubenswrapper[4831]: I1204 10:33:34.865647 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-4a99-account-create-22f2m" event={"ID":"130910ed-23cb-4473-b0cb-e59907852513","Type":"ContainerDied","Data":"0e5d58a0c51af9fbde2571028806e0359b86340cfab7b0b531d968ab7d459ce9"} Dec 04 10:33:34 crc kubenswrapper[4831]: I1204 10:33:34.865733 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e5d58a0c51af9fbde2571028806e0359b86340cfab7b0b531d968ab7d459ce9" Dec 04 10:33:34 crc kubenswrapper[4831]: I1204 10:33:34.865824 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-4a99-account-create-22f2m" Dec 04 10:33:34 crc kubenswrapper[4831]: I1204 10:33:34.890989 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" podStartSLOduration=2.890966618 podStartE2EDuration="2.890966618s" podCreationTimestamp="2025-12-04 10:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:33:34.888602108 +0000 UTC m=+1111.837777442" watchObservedRunningTime="2025-12-04 10:33:34.890966618 +0000 UTC m=+1111.840141952" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.127882 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.167133 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6mcl2"] Dec 04 10:33:35 crc kubenswrapper[4831]: E1204 10:33:35.167476 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130910ed-23cb-4473-b0cb-e59907852513" containerName="mariadb-account-create" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.167490 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="130910ed-23cb-4473-b0cb-e59907852513" containerName="mariadb-account-create" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.167686 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="130910ed-23cb-4473-b0cb-e59907852513" containerName="mariadb-account-create" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.168384 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6mcl2" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.187316 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6mcl2"] Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.216446 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnxj7\" (UniqueName: \"kubernetes.io/projected/dc17360e-f22d-4420-8de8-5a95abc8f54c-kube-api-access-cnxj7\") pod \"barbican-db-create-6mcl2\" (UID: \"dc17360e-f22d-4420-8de8-5a95abc8f54c\") " pod="openstack/barbican-db-create-6mcl2" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.293464 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xhvnt"] Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.294697 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xhvnt" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.295506 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xhvnt"] Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.317766 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ljvs\" (UniqueName: \"kubernetes.io/projected/561a4728-58d4-40dc-ad9b-604394f208d6-kube-api-access-5ljvs\") pod \"cinder-db-create-xhvnt\" (UID: \"561a4728-58d4-40dc-ad9b-604394f208d6\") " pod="openstack/cinder-db-create-xhvnt" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.318095 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnxj7\" (UniqueName: \"kubernetes.io/projected/dc17360e-f22d-4420-8de8-5a95abc8f54c-kube-api-access-cnxj7\") pod \"barbican-db-create-6mcl2\" (UID: \"dc17360e-f22d-4420-8de8-5a95abc8f54c\") " pod="openstack/barbican-db-create-6mcl2" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.340834 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnxj7\" (UniqueName: \"kubernetes.io/projected/dc17360e-f22d-4420-8de8-5a95abc8f54c-kube-api-access-cnxj7\") pod \"barbican-db-create-6mcl2\" (UID: \"dc17360e-f22d-4420-8de8-5a95abc8f54c\") " pod="openstack/barbican-db-create-6mcl2" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.413926 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.419524 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ljvs\" (UniqueName: \"kubernetes.io/projected/561a4728-58d4-40dc-ad9b-604394f208d6-kube-api-access-5ljvs\") pod \"cinder-db-create-xhvnt\" (UID: \"561a4728-58d4-40dc-ad9b-604394f208d6\") " pod="openstack/cinder-db-create-xhvnt" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.430983 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-lm2h7"] Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.432223 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.434372 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.435567 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5hfxd" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.435857 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.440230 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.447285 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lm2h7"] Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.459220 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ljvs\" (UniqueName: \"kubernetes.io/projected/561a4728-58d4-40dc-ad9b-604394f208d6-kube-api-access-5ljvs\") pod \"cinder-db-create-xhvnt\" (UID: \"561a4728-58d4-40dc-ad9b-604394f208d6\") " pod="openstack/cinder-db-create-xhvnt" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.483714 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6mcl2" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.521050 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-combined-ca-bundle\") pod \"keystone-db-sync-lm2h7\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.521108 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-config-data\") pod \"keystone-db-sync-lm2h7\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.521276 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjsnx\" (UniqueName: \"kubernetes.io/projected/c771ae2e-18c8-42c4-a789-7b55fea8e605-kube-api-access-fjsnx\") pod \"keystone-db-sync-lm2h7\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.610688 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xhvnt" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.622342 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjsnx\" (UniqueName: \"kubernetes.io/projected/c771ae2e-18c8-42c4-a789-7b55fea8e605-kube-api-access-fjsnx\") pod \"keystone-db-sync-lm2h7\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.622404 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-combined-ca-bundle\") pod \"keystone-db-sync-lm2h7\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.622426 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-config-data\") pod \"keystone-db-sync-lm2h7\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.626717 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-combined-ca-bundle\") pod \"keystone-db-sync-lm2h7\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.627421 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-config-data\") pod \"keystone-db-sync-lm2h7\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 
10:33:35.656920 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjsnx\" (UniqueName: \"kubernetes.io/projected/c771ae2e-18c8-42c4-a789-7b55fea8e605-kube-api-access-fjsnx\") pod \"keystone-db-sync-lm2h7\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.747266 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:35 crc kubenswrapper[4831]: I1204 10:33:35.947435 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6mcl2"] Dec 04 10:33:35 crc kubenswrapper[4831]: W1204 10:33:35.969968 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc17360e_f22d_4420_8de8_5a95abc8f54c.slice/crio-e399b7790754cc4bb96c41b61a5a66708f02d2a8ee6d3b01e0777b04d42bfce8 WatchSource:0}: Error finding container e399b7790754cc4bb96c41b61a5a66708f02d2a8ee6d3b01e0777b04d42bfce8: Status 404 returned error can't find the container with id e399b7790754cc4bb96c41b61a5a66708f02d2a8ee6d3b01e0777b04d42bfce8 Dec 04 10:33:36 crc kubenswrapper[4831]: I1204 10:33:36.191492 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xhvnt"] Dec 04 10:33:36 crc kubenswrapper[4831]: W1204 10:33:36.195787 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod561a4728_58d4_40dc_ad9b_604394f208d6.slice/crio-0eb23e28e420d036d34c1bade058c827eace402d6253009cb1ce3e98a593974a WatchSource:0}: Error finding container 0eb23e28e420d036d34c1bade058c827eace402d6253009cb1ce3e98a593974a: Status 404 returned error can't find the container with id 0eb23e28e420d036d34c1bade058c827eace402d6253009cb1ce3e98a593974a Dec 04 10:33:36 crc kubenswrapper[4831]: I1204 10:33:36.236004 4831 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lm2h7"] Dec 04 10:33:36 crc kubenswrapper[4831]: I1204 10:33:36.898463 4831 generic.go:334] "Generic (PLEG): container finished" podID="44484889-57d0-478f-a19a-2e78f913faf3" containerID="29d94e9ef7d95da5ed21b153b30bb46ff0865fe591891f5e9d44bf1656bd0e05" exitCode=0 Dec 04 10:33:36 crc kubenswrapper[4831]: I1204 10:33:36.898860 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44484889-57d0-478f-a19a-2e78f913faf3","Type":"ContainerDied","Data":"29d94e9ef7d95da5ed21b153b30bb46ff0865fe591891f5e9d44bf1656bd0e05"} Dec 04 10:33:36 crc kubenswrapper[4831]: I1204 10:33:36.913554 4831 generic.go:334] "Generic (PLEG): container finished" podID="561a4728-58d4-40dc-ad9b-604394f208d6" containerID="3a1d4237f22b0570e9b56ea8d266009644ae7c742001de722ea30ae565cae9d1" exitCode=0 Dec 04 10:33:36 crc kubenswrapper[4831]: I1204 10:33:36.913814 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xhvnt" event={"ID":"561a4728-58d4-40dc-ad9b-604394f208d6","Type":"ContainerDied","Data":"3a1d4237f22b0570e9b56ea8d266009644ae7c742001de722ea30ae565cae9d1"} Dec 04 10:33:36 crc kubenswrapper[4831]: I1204 10:33:36.913846 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xhvnt" event={"ID":"561a4728-58d4-40dc-ad9b-604394f208d6","Type":"ContainerStarted","Data":"0eb23e28e420d036d34c1bade058c827eace402d6253009cb1ce3e98a593974a"} Dec 04 10:33:36 crc kubenswrapper[4831]: I1204 10:33:36.917145 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lm2h7" event={"ID":"c771ae2e-18c8-42c4-a789-7b55fea8e605","Type":"ContainerStarted","Data":"4f75a1246dc7499f34fa30dc03bafde66c67644f31c3caa1714f2752efe46b18"} Dec 04 10:33:36 crc kubenswrapper[4831]: I1204 10:33:36.918850 4831 generic.go:334] "Generic (PLEG): container finished" podID="dc17360e-f22d-4420-8de8-5a95abc8f54c" 
containerID="935821672bc4757d8ed34b09d63c80b995dd9a0e6dbdb2c7c6d87b0183e4b271" exitCode=0 Dec 04 10:33:36 crc kubenswrapper[4831]: I1204 10:33:36.919957 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6mcl2" event={"ID":"dc17360e-f22d-4420-8de8-5a95abc8f54c","Type":"ContainerDied","Data":"935821672bc4757d8ed34b09d63c80b995dd9a0e6dbdb2c7c6d87b0183e4b271"} Dec 04 10:33:36 crc kubenswrapper[4831]: I1204 10:33:36.919990 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6mcl2" event={"ID":"dc17360e-f22d-4420-8de8-5a95abc8f54c","Type":"ContainerStarted","Data":"e399b7790754cc4bb96c41b61a5a66708f02d2a8ee6d3b01e0777b04d42bfce8"} Dec 04 10:33:37 crc kubenswrapper[4831]: I1204 10:33:37.939488 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44484889-57d0-478f-a19a-2e78f913faf3","Type":"ContainerStarted","Data":"da005076936568b2afdf2c6e7de2b3a8ed1bae2bb63a31f2b72e623077246a19"} Dec 04 10:33:37 crc kubenswrapper[4831]: I1204 10:33:37.945492 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pl6jb"] Dec 04 10:33:37 crc kubenswrapper[4831]: I1204 10:33:37.956131 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pl6jb" Dec 04 10:33:37 crc kubenswrapper[4831]: I1204 10:33:37.970576 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znv74\" (UniqueName: \"kubernetes.io/projected/dba9a395-7b7d-4a63-be80-61caea16396b-kube-api-access-znv74\") pod \"glance-db-create-pl6jb\" (UID: \"dba9a395-7b7d-4a63-be80-61caea16396b\") " pod="openstack/glance-db-create-pl6jb" Dec 04 10:33:37 crc kubenswrapper[4831]: I1204 10:33:37.997690 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pl6jb"] Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.071817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znv74\" (UniqueName: \"kubernetes.io/projected/dba9a395-7b7d-4a63-be80-61caea16396b-kube-api-access-znv74\") pod \"glance-db-create-pl6jb\" (UID: \"dba9a395-7b7d-4a63-be80-61caea16396b\") " pod="openstack/glance-db-create-pl6jb" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.113072 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-hwbtz"] Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.115393 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znv74\" (UniqueName: \"kubernetes.io/projected/dba9a395-7b7d-4a63-be80-61caea16396b-kube-api-access-znv74\") pod \"glance-db-create-pl6jb\" (UID: \"dba9a395-7b7d-4a63-be80-61caea16396b\") " pod="openstack/glance-db-create-pl6jb" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.119104 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.126494 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.127143 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-2vnjf" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.128316 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-hwbtz"] Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.141910 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-m6n2t"] Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.143141 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-m6n2t" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.148874 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-m6n2t"] Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.173942 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-db-sync-config-data\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.174261 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-569l9\" (UniqueName: \"kubernetes.io/projected/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-kube-api-access-569l9\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.174287 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-combined-ca-bundle\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.174434 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58ndm\" (UniqueName: \"kubernetes.io/projected/f36d8a75-9447-4921-8f46-f3fdb08160d8-kube-api-access-58ndm\") pod \"neutron-db-create-m6n2t\" (UID: \"f36d8a75-9447-4921-8f46-f3fdb08160d8\") " pod="openstack/neutron-db-create-m6n2t" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.174500 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-config-data\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.275743 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-569l9\" (UniqueName: \"kubernetes.io/projected/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-kube-api-access-569l9\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.275785 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-combined-ca-bundle\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.275835 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-58ndm\" (UniqueName: \"kubernetes.io/projected/f36d8a75-9447-4921-8f46-f3fdb08160d8-kube-api-access-58ndm\") pod \"neutron-db-create-m6n2t\" (UID: \"f36d8a75-9447-4921-8f46-f3fdb08160d8\") " pod="openstack/neutron-db-create-m6n2t" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.275876 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-config-data\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.275941 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-db-sync-config-data\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.280613 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-db-sync-config-data\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.305355 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-combined-ca-bundle\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.305781 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pl6jb" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.309126 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58ndm\" (UniqueName: \"kubernetes.io/projected/f36d8a75-9447-4921-8f46-f3fdb08160d8-kube-api-access-58ndm\") pod \"neutron-db-create-m6n2t\" (UID: \"f36d8a75-9447-4921-8f46-f3fdb08160d8\") " pod="openstack/neutron-db-create-m6n2t" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.309247 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-569l9\" (UniqueName: \"kubernetes.io/projected/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-kube-api-access-569l9\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.315472 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-config-data\") pod \"watcher-db-sync-hwbtz\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.458340 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:33:38 crc kubenswrapper[4831]: I1204 10:33:38.471927 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-m6n2t" Dec 04 10:33:39 crc kubenswrapper[4831]: I1204 10:33:39.968261 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44484889-57d0-478f-a19a-2e78f913faf3","Type":"ContainerStarted","Data":"3e76e66e07641bf5c8832f9a8bc91dc4b61d75d659d278bd05fa0d513a7eb75a"} Dec 04 10:33:42 crc kubenswrapper[4831]: I1204 10:33:42.538848 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:33:42 crc kubenswrapper[4831]: I1204 10:33:42.612797 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-798f579549-frr45"] Dec 04 10:33:42 crc kubenswrapper[4831]: I1204 10:33:42.613077 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-798f579549-frr45" podUID="83014775-6674-44f9-86ed-952fec02b15b" containerName="dnsmasq-dns" containerID="cri-o://84e34fae13b6b19084707fd60a8b792c50ecd8ea5894777117ce1d55df471f27" gracePeriod=10 Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.029217 4831 generic.go:334] "Generic (PLEG): container finished" podID="83014775-6674-44f9-86ed-952fec02b15b" containerID="84e34fae13b6b19084707fd60a8b792c50ecd8ea5894777117ce1d55df471f27" exitCode=0 Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.029274 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798f579549-frr45" event={"ID":"83014775-6674-44f9-86ed-952fec02b15b","Type":"ContainerDied","Data":"84e34fae13b6b19084707fd60a8b792c50ecd8ea5894777117ce1d55df471f27"} Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.036091 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6mcl2" event={"ID":"dc17360e-f22d-4420-8de8-5a95abc8f54c","Type":"ContainerDied","Data":"e399b7790754cc4bb96c41b61a5a66708f02d2a8ee6d3b01e0777b04d42bfce8"} Dec 04 10:33:43 crc 
kubenswrapper[4831]: I1204 10:33:43.036137 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e399b7790754cc4bb96c41b61a5a66708f02d2a8ee6d3b01e0777b04d42bfce8" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.051819 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xhvnt" event={"ID":"561a4728-58d4-40dc-ad9b-604394f208d6","Type":"ContainerDied","Data":"0eb23e28e420d036d34c1bade058c827eace402d6253009cb1ce3e98a593974a"} Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.051864 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0eb23e28e420d036d34c1bade058c827eace402d6253009cb1ce3e98a593974a" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.233630 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6mcl2" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.261522 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xhvnt" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.274649 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.359341 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnxj7\" (UniqueName: \"kubernetes.io/projected/dc17360e-f22d-4420-8de8-5a95abc8f54c-kube-api-access-cnxj7\") pod \"dc17360e-f22d-4420-8de8-5a95abc8f54c\" (UID: \"dc17360e-f22d-4420-8de8-5a95abc8f54c\") " Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.365625 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc17360e-f22d-4420-8de8-5a95abc8f54c-kube-api-access-cnxj7" (OuterVolumeSpecName: "kube-api-access-cnxj7") pod "dc17360e-f22d-4420-8de8-5a95abc8f54c" (UID: "dc17360e-f22d-4420-8de8-5a95abc8f54c"). InnerVolumeSpecName "kube-api-access-cnxj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.459184 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pl6jb"] Dec 04 10:33:43 crc kubenswrapper[4831]: W1204 10:33:43.459501 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddba9a395_7b7d_4a63_be80_61caea16396b.slice/crio-4c49348eec3122151094b2a4baa0dfe66bd579dd824258fc7d398b8ab46d4256 WatchSource:0}: Error finding container 4c49348eec3122151094b2a4baa0dfe66bd579dd824258fc7d398b8ab46d4256: Status 404 returned error can't find the container with id 4c49348eec3122151094b2a4baa0dfe66bd579dd824258fc7d398b8ab46d4256 Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.460895 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-nb\") pod \"83014775-6674-44f9-86ed-952fec02b15b\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " Dec 04 10:33:43 crc kubenswrapper[4831]: 
I1204 10:33:43.460985 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-config\") pod \"83014775-6674-44f9-86ed-952fec02b15b\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.461041 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ljvs\" (UniqueName: \"kubernetes.io/projected/561a4728-58d4-40dc-ad9b-604394f208d6-kube-api-access-5ljvs\") pod \"561a4728-58d4-40dc-ad9b-604394f208d6\" (UID: \"561a4728-58d4-40dc-ad9b-604394f208d6\") " Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.461150 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-dns-svc\") pod \"83014775-6674-44f9-86ed-952fec02b15b\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.461166 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmcwk\" (UniqueName: \"kubernetes.io/projected/83014775-6674-44f9-86ed-952fec02b15b-kube-api-access-dmcwk\") pod \"83014775-6674-44f9-86ed-952fec02b15b\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.461201 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-sb\") pod \"83014775-6674-44f9-86ed-952fec02b15b\" (UID: \"83014775-6674-44f9-86ed-952fec02b15b\") " Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.462121 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnxj7\" (UniqueName: \"kubernetes.io/projected/dc17360e-f22d-4420-8de8-5a95abc8f54c-kube-api-access-cnxj7\") on node \"crc\" 
DevicePath \"\"" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.464724 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561a4728-58d4-40dc-ad9b-604394f208d6-kube-api-access-5ljvs" (OuterVolumeSpecName: "kube-api-access-5ljvs") pod "561a4728-58d4-40dc-ad9b-604394f208d6" (UID: "561a4728-58d4-40dc-ad9b-604394f208d6"). InnerVolumeSpecName "kube-api-access-5ljvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.468853 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83014775-6674-44f9-86ed-952fec02b15b-kube-api-access-dmcwk" (OuterVolumeSpecName: "kube-api-access-dmcwk") pod "83014775-6674-44f9-86ed-952fec02b15b" (UID: "83014775-6674-44f9-86ed-952fec02b15b"). InnerVolumeSpecName "kube-api-access-dmcwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.507427 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-config" (OuterVolumeSpecName: "config") pod "83014775-6674-44f9-86ed-952fec02b15b" (UID: "83014775-6674-44f9-86ed-952fec02b15b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.507633 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "83014775-6674-44f9-86ed-952fec02b15b" (UID: "83014775-6674-44f9-86ed-952fec02b15b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.513063 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "83014775-6674-44f9-86ed-952fec02b15b" (UID: "83014775-6674-44f9-86ed-952fec02b15b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.518217 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "83014775-6674-44f9-86ed-952fec02b15b" (UID: "83014775-6674-44f9-86ed-952fec02b15b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.564349 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.564387 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmcwk\" (UniqueName: \"kubernetes.io/projected/83014775-6674-44f9-86ed-952fec02b15b-kube-api-access-dmcwk\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.564402 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.564413 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.564424 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83014775-6674-44f9-86ed-952fec02b15b-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.564434 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ljvs\" (UniqueName: \"kubernetes.io/projected/561a4728-58d4-40dc-ad9b-604394f208d6-kube-api-access-5ljvs\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.566853 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-m6n2t"] Dec 04 10:33:43 crc kubenswrapper[4831]: I1204 10:33:43.647232 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-hwbtz"] Dec 04 10:33:43 crc kubenswrapper[4831]: W1204 10:33:43.652581 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee687d9f_0cd1_40f3_a5e5_e8891e8d0d3a.slice/crio-99fc17db23ccd6fce633cd8cee0802f1759006bd9ddd03c419feb9f04ca3b9fd WatchSource:0}: Error finding container 99fc17db23ccd6fce633cd8cee0802f1759006bd9ddd03c419feb9f04ca3b9fd: Status 404 returned error can't find the container with id 99fc17db23ccd6fce633cd8cee0802f1759006bd9ddd03c419feb9f04ca3b9fd Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.069106 4831 generic.go:334] "Generic (PLEG): container finished" podID="f36d8a75-9447-4921-8f46-f3fdb08160d8" containerID="1029616cc98d2367ff3a607ed08022b9f67c4b1f011c7de6880df86463eb3310" exitCode=0 Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.069311 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-m6n2t" event={"ID":"f36d8a75-9447-4921-8f46-f3fdb08160d8","Type":"ContainerDied","Data":"1029616cc98d2367ff3a607ed08022b9f67c4b1f011c7de6880df86463eb3310"} Dec 04 10:33:44 crc 
kubenswrapper[4831]: I1204 10:33:44.069395 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-m6n2t" event={"ID":"f36d8a75-9447-4921-8f46-f3fdb08160d8","Type":"ContainerStarted","Data":"db5d7bfa5149eee09757941f527c10bb8b706354c748f578b4e350e34ee7373c"} Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.071553 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798f579549-frr45" event={"ID":"83014775-6674-44f9-86ed-952fec02b15b","Type":"ContainerDied","Data":"29354a5193235f5a90ab979a3cc8c805f488f3be134ad5471c395b75724e083f"} Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.071597 4831 scope.go:117] "RemoveContainer" containerID="84e34fae13b6b19084707fd60a8b792c50ecd8ea5894777117ce1d55df471f27" Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.071727 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-798f579549-frr45" Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.086330 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lm2h7" event={"ID":"c771ae2e-18c8-42c4-a789-7b55fea8e605","Type":"ContainerStarted","Data":"d6d1eef002d83ca29576f21cf66897b7fdd4a93cd18544403c6a0a5444b5c3ff"} Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.087584 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-hwbtz" event={"ID":"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a","Type":"ContainerStarted","Data":"99fc17db23ccd6fce633cd8cee0802f1759006bd9ddd03c419feb9f04ca3b9fd"} Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.089322 4831 generic.go:334] "Generic (PLEG): container finished" podID="dba9a395-7b7d-4a63-be80-61caea16396b" containerID="7587716e8a8723f8f0a01c1bea6561f868845004d917c80310ea5759e4e620b3" exitCode=0 Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.089439 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-pl6jb" event={"ID":"dba9a395-7b7d-4a63-be80-61caea16396b","Type":"ContainerDied","Data":"7587716e8a8723f8f0a01c1bea6561f868845004d917c80310ea5759e4e620b3"} Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.089463 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pl6jb" event={"ID":"dba9a395-7b7d-4a63-be80-61caea16396b","Type":"ContainerStarted","Data":"4c49348eec3122151094b2a4baa0dfe66bd579dd824258fc7d398b8ab46d4256"} Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.097649 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6mcl2" Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.098752 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44484889-57d0-478f-a19a-2e78f913faf3","Type":"ContainerStarted","Data":"0558a93511fc21a680303c1455178512d22324550297919104375cbc5ea001a6"} Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.098830 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xhvnt" Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.119621 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-lm2h7" podStartSLOduration=2.34231465 podStartE2EDuration="9.119604508s" podCreationTimestamp="2025-12-04 10:33:35 +0000 UTC" firstStartedPulling="2025-12-04 10:33:36.243954122 +0000 UTC m=+1113.193129436" lastFinishedPulling="2025-12-04 10:33:43.02124398 +0000 UTC m=+1119.970419294" observedRunningTime="2025-12-04 10:33:44.107911891 +0000 UTC m=+1121.057087225" watchObservedRunningTime="2025-12-04 10:33:44.119604508 +0000 UTC m=+1121.068779822" Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.125819 4831 scope.go:117] "RemoveContainer" containerID="b5d85a848c1efe10786a4e2f27f8145b231618abe83e713948a9be8257ff2bab" Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.149835 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-798f579549-frr45"] Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.162778 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-798f579549-frr45"] Dec 04 10:33:44 crc kubenswrapper[4831]: I1204 10:33:44.175537 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.175519658 podStartE2EDuration="18.175519658s" podCreationTimestamp="2025-12-04 10:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:33:44.17167656 +0000 UTC m=+1121.120851884" watchObservedRunningTime="2025-12-04 10:33:44.175519658 +0000 UTC m=+1121.124694972" Dec 04 10:33:45 crc kubenswrapper[4831]: I1204 10:33:45.315886 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83014775-6674-44f9-86ed-952fec02b15b" 
path="/var/lib/kubelet/pods/83014775-6674-44f9-86ed-952fec02b15b/volumes" Dec 04 10:33:45 crc kubenswrapper[4831]: I1204 10:33:45.497525 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pl6jb" Dec 04 10:33:45 crc kubenswrapper[4831]: I1204 10:33:45.502208 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-m6n2t" Dec 04 10:33:45 crc kubenswrapper[4831]: I1204 10:33:45.607357 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znv74\" (UniqueName: \"kubernetes.io/projected/dba9a395-7b7d-4a63-be80-61caea16396b-kube-api-access-znv74\") pod \"dba9a395-7b7d-4a63-be80-61caea16396b\" (UID: \"dba9a395-7b7d-4a63-be80-61caea16396b\") " Dec 04 10:33:45 crc kubenswrapper[4831]: I1204 10:33:45.608419 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58ndm\" (UniqueName: \"kubernetes.io/projected/f36d8a75-9447-4921-8f46-f3fdb08160d8-kube-api-access-58ndm\") pod \"f36d8a75-9447-4921-8f46-f3fdb08160d8\" (UID: \"f36d8a75-9447-4921-8f46-f3fdb08160d8\") " Dec 04 10:33:45 crc kubenswrapper[4831]: I1204 10:33:45.616112 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dba9a395-7b7d-4a63-be80-61caea16396b-kube-api-access-znv74" (OuterVolumeSpecName: "kube-api-access-znv74") pod "dba9a395-7b7d-4a63-be80-61caea16396b" (UID: "dba9a395-7b7d-4a63-be80-61caea16396b"). InnerVolumeSpecName "kube-api-access-znv74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:45 crc kubenswrapper[4831]: I1204 10:33:45.622586 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f36d8a75-9447-4921-8f46-f3fdb08160d8-kube-api-access-58ndm" (OuterVolumeSpecName: "kube-api-access-58ndm") pod "f36d8a75-9447-4921-8f46-f3fdb08160d8" (UID: "f36d8a75-9447-4921-8f46-f3fdb08160d8"). InnerVolumeSpecName "kube-api-access-58ndm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:45 crc kubenswrapper[4831]: I1204 10:33:45.710808 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znv74\" (UniqueName: \"kubernetes.io/projected/dba9a395-7b7d-4a63-be80-61caea16396b-kube-api-access-znv74\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:45 crc kubenswrapper[4831]: I1204 10:33:45.710851 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58ndm\" (UniqueName: \"kubernetes.io/projected/f36d8a75-9447-4921-8f46-f3fdb08160d8-kube-api-access-58ndm\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:46 crc kubenswrapper[4831]: I1204 10:33:46.142447 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-m6n2t" Dec 04 10:33:46 crc kubenswrapper[4831]: I1204 10:33:46.142366 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-m6n2t" event={"ID":"f36d8a75-9447-4921-8f46-f3fdb08160d8","Type":"ContainerDied","Data":"db5d7bfa5149eee09757941f527c10bb8b706354c748f578b4e350e34ee7373c"} Dec 04 10:33:46 crc kubenswrapper[4831]: I1204 10:33:46.144109 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db5d7bfa5149eee09757941f527c10bb8b706354c748f578b4e350e34ee7373c" Dec 04 10:33:46 crc kubenswrapper[4831]: I1204 10:33:46.145803 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pl6jb" event={"ID":"dba9a395-7b7d-4a63-be80-61caea16396b","Type":"ContainerDied","Data":"4c49348eec3122151094b2a4baa0dfe66bd579dd824258fc7d398b8ab46d4256"} Dec 04 10:33:46 crc kubenswrapper[4831]: I1204 10:33:46.145833 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c49348eec3122151094b2a4baa0dfe66bd579dd824258fc7d398b8ab46d4256" Dec 04 10:33:46 crc kubenswrapper[4831]: I1204 10:33:46.145874 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pl6jb" Dec 04 10:33:47 crc kubenswrapper[4831]: I1204 10:33:47.152587 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.292697 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b886-account-create-r4hm2"] Dec 04 10:33:48 crc kubenswrapper[4831]: E1204 10:33:48.293432 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36d8a75-9447-4921-8f46-f3fdb08160d8" containerName="mariadb-database-create" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.293450 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36d8a75-9447-4921-8f46-f3fdb08160d8" containerName="mariadb-database-create" Dec 04 10:33:48 crc kubenswrapper[4831]: E1204 10:33:48.293467 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83014775-6674-44f9-86ed-952fec02b15b" containerName="init" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.293475 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="83014775-6674-44f9-86ed-952fec02b15b" containerName="init" Dec 04 10:33:48 crc kubenswrapper[4831]: E1204 10:33:48.293496 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83014775-6674-44f9-86ed-952fec02b15b" containerName="dnsmasq-dns" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.293504 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="83014775-6674-44f9-86ed-952fec02b15b" containerName="dnsmasq-dns" Dec 04 10:33:48 crc kubenswrapper[4831]: E1204 10:33:48.293514 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc17360e-f22d-4420-8de8-5a95abc8f54c" containerName="mariadb-database-create" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.293522 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc17360e-f22d-4420-8de8-5a95abc8f54c" containerName="mariadb-database-create" Dec 
04 10:33:48 crc kubenswrapper[4831]: E1204 10:33:48.293542 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dba9a395-7b7d-4a63-be80-61caea16396b" containerName="mariadb-database-create" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.293550 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba9a395-7b7d-4a63-be80-61caea16396b" containerName="mariadb-database-create" Dec 04 10:33:48 crc kubenswrapper[4831]: E1204 10:33:48.293588 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561a4728-58d4-40dc-ad9b-604394f208d6" containerName="mariadb-database-create" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.293595 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="561a4728-58d4-40dc-ad9b-604394f208d6" containerName="mariadb-database-create" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.293809 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc17360e-f22d-4420-8de8-5a95abc8f54c" containerName="mariadb-database-create" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.293823 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="dba9a395-7b7d-4a63-be80-61caea16396b" containerName="mariadb-database-create" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.293846 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="83014775-6674-44f9-86ed-952fec02b15b" containerName="dnsmasq-dns" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.293860 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="561a4728-58d4-40dc-ad9b-604394f208d6" containerName="mariadb-database-create" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.293883 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f36d8a75-9447-4921-8f46-f3fdb08160d8" containerName="mariadb-database-create" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.294605 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b886-account-create-r4hm2" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.297213 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.304546 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b886-account-create-r4hm2"] Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.462534 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vkgr\" (UniqueName: \"kubernetes.io/projected/91a201ba-1bd6-41a4-a891-a82a2e017da1-kube-api-access-4vkgr\") pod \"neutron-b886-account-create-r4hm2\" (UID: \"91a201ba-1bd6-41a4-a891-a82a2e017da1\") " pod="openstack/neutron-b886-account-create-r4hm2" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.564866 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vkgr\" (UniqueName: \"kubernetes.io/projected/91a201ba-1bd6-41a4-a891-a82a2e017da1-kube-api-access-4vkgr\") pod \"neutron-b886-account-create-r4hm2\" (UID: \"91a201ba-1bd6-41a4-a891-a82a2e017da1\") " pod="openstack/neutron-b886-account-create-r4hm2" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.583677 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vkgr\" (UniqueName: \"kubernetes.io/projected/91a201ba-1bd6-41a4-a891-a82a2e017da1-kube-api-access-4vkgr\") pod \"neutron-b886-account-create-r4hm2\" (UID: \"91a201ba-1bd6-41a4-a891-a82a2e017da1\") " pod="openstack/neutron-b886-account-create-r4hm2" Dec 04 10:33:48 crc kubenswrapper[4831]: I1204 10:33:48.615389 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b886-account-create-r4hm2" Dec 04 10:33:49 crc kubenswrapper[4831]: I1204 10:33:49.177271 4831 generic.go:334] "Generic (PLEG): container finished" podID="c771ae2e-18c8-42c4-a789-7b55fea8e605" containerID="d6d1eef002d83ca29576f21cf66897b7fdd4a93cd18544403c6a0a5444b5c3ff" exitCode=0 Dec 04 10:33:49 crc kubenswrapper[4831]: I1204 10:33:49.177314 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lm2h7" event={"ID":"c771ae2e-18c8-42c4-a789-7b55fea8e605","Type":"ContainerDied","Data":"d6d1eef002d83ca29576f21cf66897b7fdd4a93cd18544403c6a0a5444b5c3ff"} Dec 04 10:33:50 crc kubenswrapper[4831]: I1204 10:33:50.469630 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:50 crc kubenswrapper[4831]: I1204 10:33:50.599892 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjsnx\" (UniqueName: \"kubernetes.io/projected/c771ae2e-18c8-42c4-a789-7b55fea8e605-kube-api-access-fjsnx\") pod \"c771ae2e-18c8-42c4-a789-7b55fea8e605\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " Dec 04 10:33:50 crc kubenswrapper[4831]: I1204 10:33:50.600072 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-config-data\") pod \"c771ae2e-18c8-42c4-a789-7b55fea8e605\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " Dec 04 10:33:50 crc kubenswrapper[4831]: I1204 10:33:50.600096 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-combined-ca-bundle\") pod \"c771ae2e-18c8-42c4-a789-7b55fea8e605\" (UID: \"c771ae2e-18c8-42c4-a789-7b55fea8e605\") " Dec 04 10:33:50 crc kubenswrapper[4831]: I1204 10:33:50.605791 4831 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c771ae2e-18c8-42c4-a789-7b55fea8e605-kube-api-access-fjsnx" (OuterVolumeSpecName: "kube-api-access-fjsnx") pod "c771ae2e-18c8-42c4-a789-7b55fea8e605" (UID: "c771ae2e-18c8-42c4-a789-7b55fea8e605"). InnerVolumeSpecName "kube-api-access-fjsnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:50 crc kubenswrapper[4831]: I1204 10:33:50.623466 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c771ae2e-18c8-42c4-a789-7b55fea8e605" (UID: "c771ae2e-18c8-42c4-a789-7b55fea8e605"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:50 crc kubenswrapper[4831]: I1204 10:33:50.668829 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-config-data" (OuterVolumeSpecName: "config-data") pod "c771ae2e-18c8-42c4-a789-7b55fea8e605" (UID: "c771ae2e-18c8-42c4-a789-7b55fea8e605"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:50 crc kubenswrapper[4831]: I1204 10:33:50.672296 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b886-account-create-r4hm2"] Dec 04 10:33:50 crc kubenswrapper[4831]: I1204 10:33:50.702057 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:50 crc kubenswrapper[4831]: I1204 10:33:50.703367 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c771ae2e-18c8-42c4-a789-7b55fea8e605-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:50 crc kubenswrapper[4831]: I1204 10:33:50.703455 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjsnx\" (UniqueName: \"kubernetes.io/projected/c771ae2e-18c8-42c4-a789-7b55fea8e605-kube-api-access-fjsnx\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.198370 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-hwbtz" event={"ID":"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a","Type":"ContainerStarted","Data":"00de293c46492e8856a60277fad082eca3709b73ad4afe3288c8baff079841aa"} Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.201230 4831 generic.go:334] "Generic (PLEG): container finished" podID="91a201ba-1bd6-41a4-a891-a82a2e017da1" containerID="3b25fa24b36a0dd425ffaf8fe0b8181de2a3297787a29654799f0fb4a0cf8896" exitCode=0 Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.201320 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b886-account-create-r4hm2" event={"ID":"91a201ba-1bd6-41a4-a891-a82a2e017da1","Type":"ContainerDied","Data":"3b25fa24b36a0dd425ffaf8fe0b8181de2a3297787a29654799f0fb4a0cf8896"} Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.201353 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b886-account-create-r4hm2" event={"ID":"91a201ba-1bd6-41a4-a891-a82a2e017da1","Type":"ContainerStarted","Data":"9b4c2573d2c55125a53423a52968bbd0808691dc0bd4be2a9481ca602692947e"} Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.207684 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lm2h7" event={"ID":"c771ae2e-18c8-42c4-a789-7b55fea8e605","Type":"ContainerDied","Data":"4f75a1246dc7499f34fa30dc03bafde66c67644f31c3caa1714f2752efe46b18"} Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.208000 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f75a1246dc7499f34fa30dc03bafde66c67644f31c3caa1714f2752efe46b18" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.207750 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lm2h7" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.243181 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-hwbtz" podStartSLOduration=6.644584613 podStartE2EDuration="13.243160291s" podCreationTimestamp="2025-12-04 10:33:38 +0000 UTC" firstStartedPulling="2025-12-04 10:33:43.654748141 +0000 UTC m=+1120.603923455" lastFinishedPulling="2025-12-04 10:33:50.253323819 +0000 UTC m=+1127.202499133" observedRunningTime="2025-12-04 10:33:51.227718799 +0000 UTC m=+1128.176894123" watchObservedRunningTime="2025-12-04 10:33:51.243160291 +0000 UTC m=+1128.192335615" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.476071 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78556fb4f5-cnrnk"] Dec 04 10:33:51 crc kubenswrapper[4831]: E1204 10:33:51.476458 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c771ae2e-18c8-42c4-a789-7b55fea8e605" containerName="keystone-db-sync" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 
10:33:51.476471 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c771ae2e-18c8-42c4-a789-7b55fea8e605" containerName="keystone-db-sync" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.476645 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c771ae2e-18c8-42c4-a789-7b55fea8e605" containerName="keystone-db-sync" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.477544 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.491886 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78556fb4f5-cnrnk"] Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.516453 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vgwjn"] Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.517563 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.519893 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.524105 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5hfxd" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.524309 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.524446 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.538751 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vgwjn"] Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.617620 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-fernet-keys\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.617694 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-credential-keys\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.617773 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56qwc\" (UniqueName: \"kubernetes.io/projected/cb735a75-e006-4422-9ef1-f1ba13d56646-kube-api-access-56qwc\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.617823 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-config-data\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.617849 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-swift-storage-0\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.617903 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-scripts\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.617932 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-combined-ca-bundle\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.617998 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-config\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.618030 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-nb\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.618061 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-svc\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.618091 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-sb\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.618115 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z47wk\" (UniqueName: \"kubernetes.io/projected/9cb9e923-4f6a-469c-b748-d14d105d2cac-kube-api-access-z47wk\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.654760 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fb66c999-tf56s"] Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.656038 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.661134 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-ldlwk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.661324 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.661426 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.662044 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.694736 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fb66c999-tf56s"] Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.724608 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-credential-keys\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.724990 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56qwc\" (UniqueName: \"kubernetes.io/projected/cb735a75-e006-4422-9ef1-f1ba13d56646-kube-api-access-56qwc\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.725055 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-config-data\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " 
pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.725088 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-swift-storage-0\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.725158 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-scripts\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.725189 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-combined-ca-bundle\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.725278 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-config\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.725334 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-nb\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 
10:33:51.725370 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-svc\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.725403 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-sb\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.725438 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z47wk\" (UniqueName: \"kubernetes.io/projected/9cb9e923-4f6a-469c-b748-d14d105d2cac-kube-api-access-z47wk\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.725499 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-fernet-keys\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.728517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-svc\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.729183 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-config\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.729717 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-nb\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.730267 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-swift-storage-0\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.731637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-sb\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.752264 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-scripts\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.755776 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-config-data\") pod 
\"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.755835 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-fernet-keys\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.760541 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-credential-keys\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.766454 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-combined-ca-bundle\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.809944 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z47wk\" (UniqueName: \"kubernetes.io/projected/9cb9e923-4f6a-469c-b748-d14d105d2cac-kube-api-access-z47wk\") pod \"dnsmasq-dns-78556fb4f5-cnrnk\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.824576 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56qwc\" (UniqueName: \"kubernetes.io/projected/cb735a75-e006-4422-9ef1-f1ba13d56646-kube-api-access-56qwc\") pod \"keystone-bootstrap-vgwjn\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " 
pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.827854 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33636a94-6988-4866-81d9-408053be5c58-logs\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.827911 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-scripts\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.828253 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wmt4\" (UniqueName: \"kubernetes.io/projected/33636a94-6988-4866-81d9-408053be5c58-kube-api-access-5wmt4\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.828286 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/33636a94-6988-4866-81d9-408053be5c58-horizon-secret-key\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.828379 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-config-data\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " 
pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.834753 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.844283 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.869132 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.878892 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.879083 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.933983 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wmt4\" (UniqueName: \"kubernetes.io/projected/33636a94-6988-4866-81d9-408053be5c58-kube-api-access-5wmt4\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.934074 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/33636a94-6988-4866-81d9-408053be5c58-horizon-secret-key\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.934102 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-config-data\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " 
pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.934152 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33636a94-6988-4866-81d9-408053be5c58-logs\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.934184 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-scripts\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.935004 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-scripts\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.937566 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.938208 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33636a94-6988-4866-81d9-408053be5c58-logs\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.941771 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-config-data\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 
10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.975418 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/33636a94-6988-4866-81d9-408053be5c58-horizon-secret-key\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:51 crc kubenswrapper[4831]: I1204 10:33:51.995338 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wmt4\" (UniqueName: \"kubernetes.io/projected/33636a94-6988-4866-81d9-408053be5c58-kube-api-access-5wmt4\") pod \"horizon-5fb66c999-tf56s\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.011303 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-9hwx9"] Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.037684 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.039109 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.041200 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-config-data\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.041229 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.041272 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-run-httpd\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.041303 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.041361 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-scripts\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.041381 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-log-httpd\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.041407 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq52w\" (UniqueName: \"kubernetes.io/projected/b769c363-a026-4bf4-9b56-3d1452b6847d-kube-api-access-vq52w\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.052187 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.052368 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hmp6k" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.052472 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.055742 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cb786f76c-8xlwx"] Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.059300 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.079959 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9hwx9"] Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.096098 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.102963 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cb786f76c-8xlwx"] Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143191 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-config-data\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143286 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cj74\" (UniqueName: \"kubernetes.io/projected/e41bbb94-b986-4268-8040-77542216b905-kube-api-access-6cj74\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143344 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-combined-ca-bundle\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143391 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-scripts\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143419 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-scripts\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143437 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpj5d\" (UniqueName: \"kubernetes.io/projected/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-kube-api-access-vpj5d\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143456 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-log-httpd\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143472 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-logs\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143498 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq52w\" (UniqueName: \"kubernetes.io/projected/b769c363-a026-4bf4-9b56-3d1452b6847d-kube-api-access-vq52w\") pod \"ceilometer-0\" (UID: 
\"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143529 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e41bbb94-b986-4268-8040-77542216b905-horizon-secret-key\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143555 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41bbb94-b986-4268-8040-77542216b905-logs\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143604 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-scripts\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143643 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-config-data\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143721 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-config-data\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 
crc kubenswrapper[4831]: I1204 10:33:52.143740 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.143760 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-run-httpd\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.144175 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-run-httpd\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.150289 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-log-httpd\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.154617 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-config-data\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.155479 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.157550 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.165396 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-scripts\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.184596 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq52w\" (UniqueName: \"kubernetes.io/projected/b769c363-a026-4bf4-9b56-3d1452b6847d-kube-api-access-vq52w\") pod \"ceilometer-0\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.187957 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78556fb4f5-cnrnk"] Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.244789 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-config-data\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.245147 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cj74\" (UniqueName: \"kubernetes.io/projected/e41bbb94-b986-4268-8040-77542216b905-kube-api-access-6cj74\") pod \"horizon-5cb786f76c-8xlwx\" (UID: 
\"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.245173 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-combined-ca-bundle\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.245221 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-scripts\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.245251 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpj5d\" (UniqueName: \"kubernetes.io/projected/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-kube-api-access-vpj5d\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.245268 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-logs\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.245343 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e41bbb94-b986-4268-8040-77542216b905-horizon-secret-key\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc 
kubenswrapper[4831]: I1204 10:33:52.245362 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41bbb94-b986-4268-8040-77542216b905-logs\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.245404 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-scripts\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.245439 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-config-data\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.246915 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-config-data\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.250085 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-scripts\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.250299 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e41bbb94-b986-4268-8040-77542216b905-logs\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.253266 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-logs\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.257255 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-scripts\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.267819 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e41bbb94-b986-4268-8040-77542216b905-horizon-secret-key\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.273296 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-combined-ca-bundle\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.295040 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cj74\" (UniqueName: \"kubernetes.io/projected/e41bbb94-b986-4268-8040-77542216b905-kube-api-access-6cj74\") pod \"horizon-5cb786f76c-8xlwx\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " 
pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.296046 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.297453 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpj5d\" (UniqueName: \"kubernetes.io/projected/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-kube-api-access-vpj5d\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.308986 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-config-data\") pod \"placement-db-sync-9hwx9\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.356086 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ddf665d6c-8wdxb"] Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.357846 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.403807 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9hwx9" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.410935 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ddf665d6c-8wdxb"] Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.430421 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.552524 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-nb\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.552912 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-svc\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.552954 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-config\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.553115 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-sb\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.553415 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-swift-storage-0\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: 
\"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.553436 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmv96\" (UniqueName: \"kubernetes.io/projected/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-kube-api-access-pmv96\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.655717 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-config\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.655782 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-sb\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.656198 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-swift-storage-0\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.656275 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmv96\" (UniqueName: \"kubernetes.io/projected/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-kube-api-access-pmv96\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: 
\"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.656446 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-nb\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.656506 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-svc\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.657679 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-config\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.657847 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-nb\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.657855 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-swift-storage-0\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 
10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.657948 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-sb\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.659044 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-svc\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.687561 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmv96\" (UniqueName: \"kubernetes.io/projected/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-kube-api-access-pmv96\") pod \"dnsmasq-dns-6ddf665d6c-8wdxb\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.716734 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vgwjn"] Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.745155 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.804192 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fb66c999-tf56s"] Dec 04 10:33:52 crc kubenswrapper[4831]: W1204 10:33:52.816262 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33636a94_6988_4866_81d9_408053be5c58.slice/crio-af0b1bbd541b7e37f8d499e43fc9a02f95cf897521af46664deef383c8f39c62 WatchSource:0}: Error finding container af0b1bbd541b7e37f8d499e43fc9a02f95cf897521af46664deef383c8f39c62: Status 404 returned error can't find the container with id af0b1bbd541b7e37f8d499e43fc9a02f95cf897521af46664deef383c8f39c62 Dec 04 10:33:52 crc kubenswrapper[4831]: I1204 10:33:52.907396 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b886-account-create-r4hm2" Dec 04 10:33:53 crc kubenswrapper[4831]: I1204 10:33:53.018223 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78556fb4f5-cnrnk"] Dec 04 10:33:53 crc kubenswrapper[4831]: W1204 10:33:53.023346 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cb9e923_4f6a_469c_b748_d14d105d2cac.slice/crio-7f4240f82bc73b192df05dee03bf0e50b3b97ac07c1d07bf451b6fa2ce84fd5e WatchSource:0}: Error finding container 7f4240f82bc73b192df05dee03bf0e50b3b97ac07c1d07bf451b6fa2ce84fd5e: Status 404 returned error can't find the container with id 7f4240f82bc73b192df05dee03bf0e50b3b97ac07c1d07bf451b6fa2ce84fd5e Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.064214 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vkgr\" (UniqueName: \"kubernetes.io/projected/91a201ba-1bd6-41a4-a891-a82a2e017da1-kube-api-access-4vkgr\") pod \"91a201ba-1bd6-41a4-a891-a82a2e017da1\" (UID: 
\"91a201ba-1bd6-41a4-a891-a82a2e017da1\") " Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.076982 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a201ba-1bd6-41a4-a891-a82a2e017da1-kube-api-access-4vkgr" (OuterVolumeSpecName: "kube-api-access-4vkgr") pod "91a201ba-1bd6-41a4-a891-a82a2e017da1" (UID: "91a201ba-1bd6-41a4-a891-a82a2e017da1"). InnerVolumeSpecName "kube-api-access-4vkgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.120042 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.126773 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cb786f76c-8xlwx"] Dec 04 10:33:54 crc kubenswrapper[4831]: W1204 10:33:53.138259 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb769c363_a026_4bf4_9b56_3d1452b6847d.slice/crio-f2c3e0a23f31bf8a111eff6abb6fe99f64eb8ffd0065a656e683fda674ca4796 WatchSource:0}: Error finding container f2c3e0a23f31bf8a111eff6abb6fe99f64eb8ffd0065a656e683fda674ca4796: Status 404 returned error can't find the container with id f2c3e0a23f31bf8a111eff6abb6fe99f64eb8ffd0065a656e683fda674ca4796 Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.165702 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vkgr\" (UniqueName: \"kubernetes.io/projected/91a201ba-1bd6-41a4-a891-a82a2e017da1-kube-api-access-4vkgr\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.256640 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b769c363-a026-4bf4-9b56-3d1452b6847d","Type":"ContainerStarted","Data":"f2c3e0a23f31bf8a111eff6abb6fe99f64eb8ffd0065a656e683fda674ca4796"} Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.257857 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cb786f76c-8xlwx" event={"ID":"e41bbb94-b986-4268-8040-77542216b905","Type":"ContainerStarted","Data":"07637652f60eb1d3bdcddc42ab98e28f52f5e2b9ed7d75a32fc515fa0eb5aec0"} Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.257961 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9hwx9"] Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.259033 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fb66c999-tf56s" event={"ID":"33636a94-6988-4866-81d9-408053be5c58","Type":"ContainerStarted","Data":"af0b1bbd541b7e37f8d499e43fc9a02f95cf897521af46664deef383c8f39c62"} Dec 04 10:33:54 crc kubenswrapper[4831]: W1204 10:33:53.264984 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec9a65ef_c4b5_4c6f_b4c1_91c51e7c070e.slice/crio-0e9857b0e3f42f15b48fc33dbf305ca1f2e5745bd0cfa5a505e1f0c2f262352b WatchSource:0}: Error finding container 0e9857b0e3f42f15b48fc33dbf305ca1f2e5745bd0cfa5a505e1f0c2f262352b: Status 404 returned error can't find the container with id 0e9857b0e3f42f15b48fc33dbf305ca1f2e5745bd0cfa5a505e1f0c2f262352b Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.267187 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgwjn" event={"ID":"cb735a75-e006-4422-9ef1-f1ba13d56646","Type":"ContainerStarted","Data":"5514ffa5aa7b95857d999cd68e6858a5314a5224d4bd3c5d55ae82cfacefa7fc"} Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.268521 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" event={"ID":"9cb9e923-4f6a-469c-b748-d14d105d2cac","Type":"ContainerStarted","Data":"7f4240f82bc73b192df05dee03bf0e50b3b97ac07c1d07bf451b6fa2ce84fd5e"} Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.269991 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-b886-account-create-r4hm2" event={"ID":"91a201ba-1bd6-41a4-a891-a82a2e017da1","Type":"ContainerDied","Data":"9b4c2573d2c55125a53423a52968bbd0808691dc0bd4be2a9481ca602692947e"} Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.270012 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b4c2573d2c55125a53423a52968bbd0808691dc0bd4be2a9481ca602692947e" Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.270071 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b886-account-create-r4hm2" Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:53.375467 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ddf665d6c-8wdxb"] Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.292684 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgwjn" event={"ID":"cb735a75-e006-4422-9ef1-f1ba13d56646","Type":"ContainerStarted","Data":"20ad925ebfdeacb2990ca32c947d6b7e78747b6b27ce748e6d9105619c7a23ab"} Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.306192 4831 generic.go:334] "Generic (PLEG): container finished" podID="e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" containerID="babba742bc0c252166bceb9fdddd621b9493270a5f6e6c907a55cac89a927268" exitCode=0 Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.307200 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" event={"ID":"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad","Type":"ContainerDied","Data":"babba742bc0c252166bceb9fdddd621b9493270a5f6e6c907a55cac89a927268"} Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.307271 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" event={"ID":"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad","Type":"ContainerStarted","Data":"7478e31b15553d2de1dd23e54b7e18fe6046fb9e0131bd67ed1bb96d0522697b"} Dec 04 10:33:54 crc 
kubenswrapper[4831]: I1204 10:33:54.309749 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9hwx9" event={"ID":"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e","Type":"ContainerStarted","Data":"0e9857b0e3f42f15b48fc33dbf305ca1f2e5745bd0cfa5a505e1f0c2f262352b"} Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.323017 4831 generic.go:334] "Generic (PLEG): container finished" podID="9cb9e923-4f6a-469c-b748-d14d105d2cac" containerID="6c81018892b7002067576527b7258ccb284dd96f0bc35cbfd4eec9fe10e8ec40" exitCode=0 Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.323062 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" event={"ID":"9cb9e923-4f6a-469c-b748-d14d105d2cac","Type":"ContainerDied","Data":"6c81018892b7002067576527b7258ccb284dd96f0bc35cbfd4eec9fe10e8ec40"} Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.332686 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vgwjn" podStartSLOduration=3.3326444410000002 podStartE2EDuration="3.332644441s" podCreationTimestamp="2025-12-04 10:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:33:54.319024435 +0000 UTC m=+1131.268199759" watchObservedRunningTime="2025-12-04 10:33:54.332644441 +0000 UTC m=+1131.281819755" Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.816900 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.911108 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-svc\") pod \"9cb9e923-4f6a-469c-b748-d14d105d2cac\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.911351 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-config\") pod \"9cb9e923-4f6a-469c-b748-d14d105d2cac\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.911438 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z47wk\" (UniqueName: \"kubernetes.io/projected/9cb9e923-4f6a-469c-b748-d14d105d2cac-kube-api-access-z47wk\") pod \"9cb9e923-4f6a-469c-b748-d14d105d2cac\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.911728 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-swift-storage-0\") pod \"9cb9e923-4f6a-469c-b748-d14d105d2cac\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.911812 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-sb\") pod \"9cb9e923-4f6a-469c-b748-d14d105d2cac\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.912052 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-nb\") pod \"9cb9e923-4f6a-469c-b748-d14d105d2cac\" (UID: \"9cb9e923-4f6a-469c-b748-d14d105d2cac\") " Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.916460 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb9e923-4f6a-469c-b748-d14d105d2cac-kube-api-access-z47wk" (OuterVolumeSpecName: "kube-api-access-z47wk") pod "9cb9e923-4f6a-469c-b748-d14d105d2cac" (UID: "9cb9e923-4f6a-469c-b748-d14d105d2cac"). InnerVolumeSpecName "kube-api-access-z47wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.935690 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9cb9e923-4f6a-469c-b748-d14d105d2cac" (UID: "9cb9e923-4f6a-469c-b748-d14d105d2cac"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.952003 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9cb9e923-4f6a-469c-b748-d14d105d2cac" (UID: "9cb9e923-4f6a-469c-b748-d14d105d2cac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.954319 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9cb9e923-4f6a-469c-b748-d14d105d2cac" (UID: "9cb9e923-4f6a-469c-b748-d14d105d2cac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.955378 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9cb9e923-4f6a-469c-b748-d14d105d2cac" (UID: "9cb9e923-4f6a-469c-b748-d14d105d2cac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:54 crc kubenswrapper[4831]: I1204 10:33:54.966078 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-config" (OuterVolumeSpecName: "config") pod "9cb9e923-4f6a-469c-b748-d14d105d2cac" (UID: "9cb9e923-4f6a-469c-b748-d14d105d2cac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.015426 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.015772 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.015788 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.015799 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 
crc kubenswrapper[4831]: I1204 10:33:55.015813 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb9e923-4f6a-469c-b748-d14d105d2cac-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.015824 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z47wk\" (UniqueName: \"kubernetes.io/projected/9cb9e923-4f6a-469c-b748-d14d105d2cac-kube-api-access-z47wk\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.072524 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cb786f76c-8xlwx"] Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.093270 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.108938 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f77cc7d69-g6vcb"] Dec 04 10:33:55 crc kubenswrapper[4831]: E1204 10:33:55.109292 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb9e923-4f6a-469c-b748-d14d105d2cac" containerName="init" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.109307 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb9e923-4f6a-469c-b748-d14d105d2cac" containerName="init" Dec 04 10:33:55 crc kubenswrapper[4831]: E1204 10:33:55.109338 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a201ba-1bd6-41a4-a891-a82a2e017da1" containerName="mariadb-account-create" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.109346 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a201ba-1bd6-41a4-a891-a82a2e017da1" containerName="mariadb-account-create" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.109547 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a201ba-1bd6-41a4-a891-a82a2e017da1" containerName="mariadb-account-create" Dec 04 10:33:55 crc 
kubenswrapper[4831]: I1204 10:33:55.109575 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb9e923-4f6a-469c-b748-d14d105d2cac" containerName="init" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.110430 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.121166 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f77cc7d69-g6vcb"] Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.219791 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-scripts\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.219887 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-config-data\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.219931 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85393e48-dbc6-41c7-9419-4baf00f072db-logs\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.219965 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmzzf\" (UniqueName: \"kubernetes.io/projected/85393e48-dbc6-41c7-9419-4baf00f072db-kube-api-access-kmzzf\") pod \"horizon-6f77cc7d69-g6vcb\" 
(UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.220037 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85393e48-dbc6-41c7-9419-4baf00f072db-horizon-secret-key\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.303898 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ba03-account-create-gvcf6"] Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.312851 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ba03-account-create-gvcf6" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.315112 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.316611 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ba03-account-create-gvcf6"] Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.323866 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-scripts\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.323967 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-config-data\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.324011 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85393e48-dbc6-41c7-9419-4baf00f072db-logs\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.324046 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmzzf\" (UniqueName: \"kubernetes.io/projected/85393e48-dbc6-41c7-9419-4baf00f072db-kube-api-access-kmzzf\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.324115 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85393e48-dbc6-41c7-9419-4baf00f072db-horizon-secret-key\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.325711 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85393e48-dbc6-41c7-9419-4baf00f072db-logs\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.326021 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-scripts\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.328821 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-config-data\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.331410 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85393e48-dbc6-41c7-9419-4baf00f072db-horizon-secret-key\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.341647 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmzzf\" (UniqueName: \"kubernetes.io/projected/85393e48-dbc6-41c7-9419-4baf00f072db-kube-api-access-kmzzf\") pod \"horizon-6f77cc7d69-g6vcb\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.351864 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" event={"ID":"9cb9e923-4f6a-469c-b748-d14d105d2cac","Type":"ContainerDied","Data":"7f4240f82bc73b192df05dee03bf0e50b3b97ac07c1d07bf451b6fa2ce84fd5e"} Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.351910 4831 scope.go:117] "RemoveContainer" containerID="6c81018892b7002067576527b7258ccb284dd96f0bc35cbfd4eec9fe10e8ec40" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.351911 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78556fb4f5-cnrnk" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.355304 4831 generic.go:334] "Generic (PLEG): container finished" podID="ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a" containerID="00de293c46492e8856a60277fad082eca3709b73ad4afe3288c8baff079841aa" exitCode=0 Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.355360 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-hwbtz" event={"ID":"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a","Type":"ContainerDied","Data":"00de293c46492e8856a60277fad082eca3709b73ad4afe3288c8baff079841aa"} Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.374097 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" event={"ID":"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad","Type":"ContainerStarted","Data":"93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064"} Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.374317 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.413151 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-edf8-account-create-w79xp"] Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.414531 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-edf8-account-create-w79xp" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.417465 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.426165 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt2dx\" (UniqueName: \"kubernetes.io/projected/d40946a3-b49e-4cb0-b113-18711def8c0e-kube-api-access-kt2dx\") pod \"barbican-ba03-account-create-gvcf6\" (UID: \"d40946a3-b49e-4cb0-b113-18711def8c0e\") " pod="openstack/barbican-ba03-account-create-gvcf6" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.427518 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.448181 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-edf8-account-create-w79xp"] Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.450178 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" podStartSLOduration=3.450161546 podStartE2EDuration="3.450161546s" podCreationTimestamp="2025-12-04 10:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:33:55.417222659 +0000 UTC m=+1132.366397983" watchObservedRunningTime="2025-12-04 10:33:55.450161546 +0000 UTC m=+1132.399336860" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.529782 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt2dx\" (UniqueName: \"kubernetes.io/projected/d40946a3-b49e-4cb0-b113-18711def8c0e-kube-api-access-kt2dx\") pod \"barbican-ba03-account-create-gvcf6\" (UID: \"d40946a3-b49e-4cb0-b113-18711def8c0e\") " 
pod="openstack/barbican-ba03-account-create-gvcf6" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.531278 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm5v9\" (UniqueName: \"kubernetes.io/projected/664dafb4-41fc-4bfc-8355-32ae4ef51867-kube-api-access-wm5v9\") pod \"cinder-edf8-account-create-w79xp\" (UID: \"664dafb4-41fc-4bfc-8355-32ae4ef51867\") " pod="openstack/cinder-edf8-account-create-w79xp" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.563312 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt2dx\" (UniqueName: \"kubernetes.io/projected/d40946a3-b49e-4cb0-b113-18711def8c0e-kube-api-access-kt2dx\") pod \"barbican-ba03-account-create-gvcf6\" (UID: \"d40946a3-b49e-4cb0-b113-18711def8c0e\") " pod="openstack/barbican-ba03-account-create-gvcf6" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.633770 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm5v9\" (UniqueName: \"kubernetes.io/projected/664dafb4-41fc-4bfc-8355-32ae4ef51867-kube-api-access-wm5v9\") pod \"cinder-edf8-account-create-w79xp\" (UID: \"664dafb4-41fc-4bfc-8355-32ae4ef51867\") " pod="openstack/cinder-edf8-account-create-w79xp" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.641239 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78556fb4f5-cnrnk"] Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.650351 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78556fb4f5-cnrnk"] Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.656270 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm5v9\" (UniqueName: \"kubernetes.io/projected/664dafb4-41fc-4bfc-8355-32ae4ef51867-kube-api-access-wm5v9\") pod \"cinder-edf8-account-create-w79xp\" (UID: \"664dafb4-41fc-4bfc-8355-32ae4ef51867\") " 
pod="openstack/cinder-edf8-account-create-w79xp" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.742003 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ba03-account-create-gvcf6" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.868832 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-edf8-account-create-w79xp" Dec 04 10:33:55 crc kubenswrapper[4831]: I1204 10:33:55.991715 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f77cc7d69-g6vcb"] Dec 04 10:33:55 crc kubenswrapper[4831]: W1204 10:33:55.993647 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85393e48_dbc6_41c7_9419_4baf00f072db.slice/crio-e73ac3f72a023789976fd027e5019729d742db81e03acf7e43e140d1704cea71 WatchSource:0}: Error finding container e73ac3f72a023789976fd027e5019729d742db81e03acf7e43e140d1704cea71: Status 404 returned error can't find the container with id e73ac3f72a023789976fd027e5019729d742db81e03acf7e43e140d1704cea71 Dec 04 10:33:56 crc kubenswrapper[4831]: I1204 10:33:56.065962 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ba03-account-create-gvcf6"] Dec 04 10:33:56 crc kubenswrapper[4831]: I1204 10:33:56.410858 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ba03-account-create-gvcf6" event={"ID":"d40946a3-b49e-4cb0-b113-18711def8c0e","Type":"ContainerStarted","Data":"3a59c0d03be93060e3a9fe378e2e30e8fff82d35410f3b3df1ebbbb59f7355d1"} Dec 04 10:33:56 crc kubenswrapper[4831]: I1204 10:33:56.421437 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-edf8-account-create-w79xp"] Dec 04 10:33:56 crc kubenswrapper[4831]: I1204 10:33:56.425994 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f77cc7d69-g6vcb" 
event={"ID":"85393e48-dbc6-41c7-9419-4baf00f072db","Type":"ContainerStarted","Data":"e73ac3f72a023789976fd027e5019729d742db81e03acf7e43e140d1704cea71"} Dec 04 10:33:57 crc kubenswrapper[4831]: I1204 10:33:57.151841 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:57 crc kubenswrapper[4831]: I1204 10:33:57.157776 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:57 crc kubenswrapper[4831]: I1204 10:33:57.291903 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb9e923-4f6a-469c-b748-d14d105d2cac" path="/var/lib/kubelet/pods/9cb9e923-4f6a-469c-b748-d14d105d2cac/volumes" Dec 04 10:33:57 crc kubenswrapper[4831]: I1204 10:33:57.438191 4831 generic.go:334] "Generic (PLEG): container finished" podID="d40946a3-b49e-4cb0-b113-18711def8c0e" containerID="32a6413864550a3235d47d439de607080de0a4fcd14b12c63c541d7e0e15d3fe" exitCode=0 Dec 04 10:33:57 crc kubenswrapper[4831]: I1204 10:33:57.438433 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ba03-account-create-gvcf6" event={"ID":"d40946a3-b49e-4cb0-b113-18711def8c0e","Type":"ContainerDied","Data":"32a6413864550a3235d47d439de607080de0a4fcd14b12c63c541d7e0e15d3fe"} Dec 04 10:33:57 crc kubenswrapper[4831]: I1204 10:33:57.442684 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.039238 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-377a-account-create-c9vwp"] Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.044347 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-377a-account-create-c9vwp" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.047247 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.072821 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-377a-account-create-c9vwp"] Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.130585 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxwlq\" (UniqueName: \"kubernetes.io/projected/f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f-kube-api-access-jxwlq\") pod \"glance-377a-account-create-c9vwp\" (UID: \"f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f\") " pod="openstack/glance-377a-account-create-c9vwp" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.232452 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxwlq\" (UniqueName: \"kubernetes.io/projected/f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f-kube-api-access-jxwlq\") pod \"glance-377a-account-create-c9vwp\" (UID: \"f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f\") " pod="openstack/glance-377a-account-create-c9vwp" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.256701 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxwlq\" (UniqueName: \"kubernetes.io/projected/f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f-kube-api-access-jxwlq\") pod \"glance-377a-account-create-c9vwp\" (UID: \"f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f\") " pod="openstack/glance-377a-account-create-c9vwp" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.369717 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-377a-account-create-c9vwp" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.642987 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-q6jc5"] Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.644822 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.648583 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.649370 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cf59p" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.649431 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.666311 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-q6jc5"] Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.742577 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-config\") pod \"neutron-db-sync-q6jc5\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.742670 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpd6b\" (UniqueName: \"kubernetes.io/projected/41899055-8db6-4cdb-a9da-2bbb143b9f3f-kube-api-access-hpd6b\") pod \"neutron-db-sync-q6jc5\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.742805 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-combined-ca-bundle\") pod \"neutron-db-sync-q6jc5\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.846086 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-config\") pod \"neutron-db-sync-q6jc5\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.846140 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpd6b\" (UniqueName: \"kubernetes.io/projected/41899055-8db6-4cdb-a9da-2bbb143b9f3f-kube-api-access-hpd6b\") pod \"neutron-db-sync-q6jc5\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.846180 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-combined-ca-bundle\") pod \"neutron-db-sync-q6jc5\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.866219 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-combined-ca-bundle\") pod \"neutron-db-sync-q6jc5\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.868308 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-config\") pod 
\"neutron-db-sync-q6jc5\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.868788 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpd6b\" (UniqueName: \"kubernetes.io/projected/41899055-8db6-4cdb-a9da-2bbb143b9f3f-kube-api-access-hpd6b\") pod \"neutron-db-sync-q6jc5\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:33:58 crc kubenswrapper[4831]: I1204 10:33:58.965256 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:34:00 crc kubenswrapper[4831]: I1204 10:34:00.467783 4831 generic.go:334] "Generic (PLEG): container finished" podID="cb735a75-e006-4422-9ef1-f1ba13d56646" containerID="20ad925ebfdeacb2990ca32c947d6b7e78747b6b27ce748e6d9105619c7a23ab" exitCode=0 Dec 04 10:34:00 crc kubenswrapper[4831]: I1204 10:34:00.467897 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgwjn" event={"ID":"cb735a75-e006-4422-9ef1-f1ba13d56646","Type":"ContainerDied","Data":"20ad925ebfdeacb2990ca32c947d6b7e78747b6b27ce748e6d9105619c7a23ab"} Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.177005 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fb66c999-tf56s"] Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.223520 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6694d6d998-wgcht"] Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.227322 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.231717 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.253130 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6694d6d998-wgcht"] Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.288570 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-config-data\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.288927 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-scripts\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.288980 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwhgl\" (UniqueName: \"kubernetes.io/projected/42db10e0-67a3-49d3-b6c5-8f48e31775e7-kube-api-access-gwhgl\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.289104 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-secret-key\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 
10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.289195 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42db10e0-67a3-49d3-b6c5-8f48e31775e7-logs\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.289232 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-tls-certs\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.289284 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-combined-ca-bundle\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.315976 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f77cc7d69-g6vcb"] Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.346794 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cf8898794-dbfdf"] Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.348181 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.384989 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cf8898794-dbfdf"] Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.394780 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-scripts\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.394865 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwhgl\" (UniqueName: \"kubernetes.io/projected/42db10e0-67a3-49d3-b6c5-8f48e31775e7-kube-api-access-gwhgl\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.394949 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-secret-key\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.396071 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-scripts\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.403969 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42db10e0-67a3-49d3-b6c5-8f48e31775e7-logs\") pod 
\"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.404048 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-tls-certs\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.404130 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-combined-ca-bundle\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.404244 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-config-data\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.405073 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42db10e0-67a3-49d3-b6c5-8f48e31775e7-logs\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.405390 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-config-data\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 
10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.410988 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-secret-key\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.411563 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-combined-ca-bundle\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.425568 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwhgl\" (UniqueName: \"kubernetes.io/projected/42db10e0-67a3-49d3-b6c5-8f48e31775e7-kube-api-access-gwhgl\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.446026 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-tls-certs\") pod \"horizon-6694d6d998-wgcht\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.506158 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1f8e9de-4491-4e25-bee0-457da0500046-config-data\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.506223 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f8e9de-4491-4e25-bee0-457da0500046-logs\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.506252 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j2lb\" (UniqueName: \"kubernetes.io/projected/f1f8e9de-4491-4e25-bee0-457da0500046-kube-api-access-9j2lb\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.506428 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f8e9de-4491-4e25-bee0-457da0500046-horizon-tls-certs\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.506696 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f8e9de-4491-4e25-bee0-457da0500046-scripts\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.506723 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f8e9de-4491-4e25-bee0-457da0500046-combined-ca-bundle\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.506784 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f1f8e9de-4491-4e25-bee0-457da0500046-horizon-secret-key\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.553286 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.608226 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1f8e9de-4491-4e25-bee0-457da0500046-config-data\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.608281 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f8e9de-4491-4e25-bee0-457da0500046-logs\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.608301 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j2lb\" (UniqueName: \"kubernetes.io/projected/f1f8e9de-4491-4e25-bee0-457da0500046-kube-api-access-9j2lb\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.608334 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f8e9de-4491-4e25-bee0-457da0500046-horizon-tls-certs\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " 
pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.608396 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f8e9de-4491-4e25-bee0-457da0500046-scripts\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.608415 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f8e9de-4491-4e25-bee0-457da0500046-combined-ca-bundle\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.608442 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f1f8e9de-4491-4e25-bee0-457da0500046-horizon-secret-key\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.608941 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f8e9de-4491-4e25-bee0-457da0500046-logs\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.609630 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f8e9de-4491-4e25-bee0-457da0500046-scripts\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.609854 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1f8e9de-4491-4e25-bee0-457da0500046-config-data\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.615309 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f8e9de-4491-4e25-bee0-457da0500046-horizon-tls-certs\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.616869 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f1f8e9de-4491-4e25-bee0-457da0500046-horizon-secret-key\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.638314 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f8e9de-4491-4e25-bee0-457da0500046-combined-ca-bundle\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.643649 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j2lb\" (UniqueName: \"kubernetes.io/projected/f1f8e9de-4491-4e25-bee0-457da0500046-kube-api-access-9j2lb\") pod \"horizon-5cf8898794-dbfdf\" (UID: \"f1f8e9de-4491-4e25-bee0-457da0500046\") " pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:01 crc kubenswrapper[4831]: I1204 10:34:01.705117 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:02 crc kubenswrapper[4831]: I1204 10:34:02.747820 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:34:02 crc kubenswrapper[4831]: I1204 10:34:02.820694 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59db87f787-ql2pd"] Dec 04 10:34:02 crc kubenswrapper[4831]: I1204 10:34:02.821208 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" podUID="2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" containerName="dnsmasq-dns" containerID="cri-o://82cdfc36281272e7a838a3d13d489979203c8808a5e83c3711a76a6ac90ae985" gracePeriod=10 Dec 04 10:34:03 crc kubenswrapper[4831]: I1204 10:34:03.502946 4831 generic.go:334] "Generic (PLEG): container finished" podID="2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" containerID="82cdfc36281272e7a838a3d13d489979203c8808a5e83c3711a76a6ac90ae985" exitCode=0 Dec 04 10:34:03 crc kubenswrapper[4831]: I1204 10:34:03.503258 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" event={"ID":"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61","Type":"ContainerDied","Data":"82cdfc36281272e7a838a3d13d489979203c8808a5e83c3711a76a6ac90ae985"} Dec 04 10:34:07 crc kubenswrapper[4831]: I1204 10:34:07.537972 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" podUID="2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: connect: connection refused" Dec 04 10:34:08 crc kubenswrapper[4831]: E1204 10:34:08.822006 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Dec 04 10:34:08 crc kubenswrapper[4831]: E1204 
10:34:08.822053 4831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Dec 04 10:34:08 crc kubenswrapper[4831]: E1204 10:34:08.822176 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:38.102.83.47:5001/podified-master-centos10/openstack-placement-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpj5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabili
ties:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-9hwx9_openstack(ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:34:08 crc kubenswrapper[4831]: E1204 10:34:08.823533 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-9hwx9" podUID="ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" Dec 04 10:34:08 crc kubenswrapper[4831]: E1204 10:34:08.854865 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 04 10:34:08 crc kubenswrapper[4831]: E1204 10:34:08.854945 4831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 04 10:34:08 crc kubenswrapper[4831]: E1204 10:34:08.855105 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.47:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n546h645h57h5cbhbch54dhb7h4h5d7h689h649h576h5cfh77h544h59bh5fchfh7dh5fch64dhfh9bh56h59bh656hd8h5cch56dh55h5c9h7q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5wmt4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5fb66c999-tf56s_openstack(33636a94-6988-4866-81d9-408053be5c58): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:34:08 crc kubenswrapper[4831]: I1204 
10:34:08.857804 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 04 10:34:08 crc kubenswrapper[4831]: E1204 10:34:08.878495 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.47:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-5fb66c999-tf56s" podUID="33636a94-6988-4866-81d9-408053be5c58" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.105587 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ba03-account-create-gvcf6" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.139257 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.152878 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.256113 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-scripts\") pod \"cb735a75-e006-4422-9ef1-f1ba13d56646\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.256622 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-569l9\" (UniqueName: \"kubernetes.io/projected/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-kube-api-access-569l9\") pod \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.256835 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-credential-keys\") pod \"cb735a75-e006-4422-9ef1-f1ba13d56646\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.256963 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-config-data\") pod \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.257042 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-db-sync-config-data\") pod \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.257130 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kt2dx\" (UniqueName: \"kubernetes.io/projected/d40946a3-b49e-4cb0-b113-18711def8c0e-kube-api-access-kt2dx\") pod \"d40946a3-b49e-4cb0-b113-18711def8c0e\" (UID: \"d40946a3-b49e-4cb0-b113-18711def8c0e\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.257247 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-config-data\") pod \"cb735a75-e006-4422-9ef1-f1ba13d56646\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.257357 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-combined-ca-bundle\") pod \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\" (UID: \"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.257473 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56qwc\" (UniqueName: \"kubernetes.io/projected/cb735a75-e006-4422-9ef1-f1ba13d56646-kube-api-access-56qwc\") pod \"cb735a75-e006-4422-9ef1-f1ba13d56646\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.257581 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-combined-ca-bundle\") pod \"cb735a75-e006-4422-9ef1-f1ba13d56646\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.257647 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-fernet-keys\") pod \"cb735a75-e006-4422-9ef1-f1ba13d56646\" (UID: \"cb735a75-e006-4422-9ef1-f1ba13d56646\") " 
Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.269759 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.270222 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cb735a75-e006-4422-9ef1-f1ba13d56646" (UID: "cb735a75-e006-4422-9ef1-f1ba13d56646"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.270368 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-scripts" (OuterVolumeSpecName: "scripts") pod "cb735a75-e006-4422-9ef1-f1ba13d56646" (UID: "cb735a75-e006-4422-9ef1-f1ba13d56646"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.271295 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-kube-api-access-569l9" (OuterVolumeSpecName: "kube-api-access-569l9") pod "ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a" (UID: "ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a"). InnerVolumeSpecName "kube-api-access-569l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.273689 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cb735a75-e006-4422-9ef1-f1ba13d56646" (UID: "cb735a75-e006-4422-9ef1-f1ba13d56646"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.273944 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb735a75-e006-4422-9ef1-f1ba13d56646-kube-api-access-56qwc" (OuterVolumeSpecName: "kube-api-access-56qwc") pod "cb735a75-e006-4422-9ef1-f1ba13d56646" (UID: "cb735a75-e006-4422-9ef1-f1ba13d56646"). InnerVolumeSpecName "kube-api-access-56qwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.274403 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a" (UID: "ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.277768 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40946a3-b49e-4cb0-b113-18711def8c0e-kube-api-access-kt2dx" (OuterVolumeSpecName: "kube-api-access-kt2dx") pod "d40946a3-b49e-4cb0-b113-18711def8c0e" (UID: "d40946a3-b49e-4cb0-b113-18711def8c0e"). InnerVolumeSpecName "kube-api-access-kt2dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.359491 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fg7d\" (UniqueName: \"kubernetes.io/projected/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-kube-api-access-2fg7d\") pod \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.359526 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-swift-storage-0\") pod \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.359688 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-nb\") pod \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.359748 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-sb\") pod \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.359776 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-config\") pod \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.359830 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-svc\") pod \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\" (UID: \"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61\") " Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.360216 4831 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.360233 4831 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.360242 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt2dx\" (UniqueName: \"kubernetes.io/projected/d40946a3-b49e-4cb0-b113-18711def8c0e-kube-api-access-kt2dx\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.360251 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56qwc\" (UniqueName: \"kubernetes.io/projected/cb735a75-e006-4422-9ef1-f1ba13d56646-kube-api-access-56qwc\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.360260 4831 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.360268 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.360277 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-569l9\" (UniqueName: 
\"kubernetes.io/projected/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-kube-api-access-569l9\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.366988 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-kube-api-access-2fg7d" (OuterVolumeSpecName: "kube-api-access-2fg7d") pod "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" (UID: "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61"). InnerVolumeSpecName "kube-api-access-2fg7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.367004 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a" (UID: "ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.375171 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-config-data" (OuterVolumeSpecName: "config-data") pod "cb735a75-e006-4422-9ef1-f1ba13d56646" (UID: "cb735a75-e006-4422-9ef1-f1ba13d56646"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.438057 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-377a-account-create-c9vwp"] Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.445753 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb735a75-e006-4422-9ef1-f1ba13d56646" (UID: "cb735a75-e006-4422-9ef1-f1ba13d56646"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: W1204 10:34:09.448283 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf67e7c2e_46d3_4e53_97fe_1b1a7d2e7d9f.slice/crio-a4f59bb0c9730b008ecd4a186846008d8d2e7365d725abc1653c25add7b9fcde WatchSource:0}: Error finding container a4f59bb0c9730b008ecd4a186846008d8d2e7365d725abc1653c25add7b9fcde: Status 404 returned error can't find the container with id a4f59bb0c9730b008ecd4a186846008d8d2e7365d725abc1653c25add7b9fcde Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.453116 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.461788 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.461817 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fg7d\" (UniqueName: \"kubernetes.io/projected/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-kube-api-access-2fg7d\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.461827 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb735a75-e006-4422-9ef1-f1ba13d56646-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.461834 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.573354 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican-ba03-account-create-gvcf6" event={"ID":"d40946a3-b49e-4cb0-b113-18711def8c0e","Type":"ContainerDied","Data":"3a59c0d03be93060e3a9fe378e2e30e8fff82d35410f3b3df1ebbbb59f7355d1"} Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.573734 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a59c0d03be93060e3a9fe378e2e30e8fff82d35410f3b3df1ebbbb59f7355d1" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.573865 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ba03-account-create-gvcf6" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.589503 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6694d6d998-wgcht"] Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.594598 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" event={"ID":"2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61","Type":"ContainerDied","Data":"3b2ab94b0ba2a0fb250c141450a8c84a7e427f1c716a41e4539468d6ebc71464"} Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.594638 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59db87f787-ql2pd" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.594648 4831 scope.go:117] "RemoveContainer" containerID="82cdfc36281272e7a838a3d13d489979203c8808a5e83c3711a76a6ac90ae985" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.594979 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" (UID: "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.601054 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" (UID: "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.603758 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vgwjn" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.604574 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgwjn" event={"ID":"cb735a75-e006-4422-9ef1-f1ba13d56646","Type":"ContainerDied","Data":"5514ffa5aa7b95857d999cd68e6858a5314a5224d4bd3c5d55ae82cfacefa7fc"} Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.604834 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5514ffa5aa7b95857d999cd68e6858a5314a5224d4bd3c5d55ae82cfacefa7fc" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.612919 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f77cc7d69-g6vcb" event={"ID":"85393e48-dbc6-41c7-9419-4baf00f072db","Type":"ContainerStarted","Data":"936c2a23b071f14add02e0c514fe8771dabd15308495a12484cd9404a1590f1d"} Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.616308 4831 generic.go:334] "Generic (PLEG): container finished" podID="664dafb4-41fc-4bfc-8355-32ae4ef51867" containerID="292bb796952f79a1b29382424747ba3fc7838fec6389269074b00bfd131670d1" exitCode=0 Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.616361 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-edf8-account-create-w79xp" 
event={"ID":"664dafb4-41fc-4bfc-8355-32ae4ef51867","Type":"ContainerDied","Data":"292bb796952f79a1b29382424747ba3fc7838fec6389269074b00bfd131670d1"} Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.616382 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-edf8-account-create-w79xp" event={"ID":"664dafb4-41fc-4bfc-8355-32ae4ef51867","Type":"ContainerStarted","Data":"4822608cfc2181a01a791220dccb8ba9ac4a97eff3641f777ac7c4fb760c92cd"} Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.618121 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-377a-account-create-c9vwp" event={"ID":"f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f","Type":"ContainerStarted","Data":"a4f59bb0c9730b008ecd4a186846008d8d2e7365d725abc1653c25add7b9fcde"} Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.626265 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-hwbtz" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.626262 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-hwbtz" event={"ID":"ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a","Type":"ContainerDied","Data":"99fc17db23ccd6fce633cd8cee0802f1759006bd9ddd03c419feb9f04ca3b9fd"} Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.626454 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99fc17db23ccd6fce633cd8cee0802f1759006bd9ddd03c419feb9f04ca3b9fd" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.626965 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-config-data" (OuterVolumeSpecName: "config-data") pod "ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a" (UID: "ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.632867 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" (UID: "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.636820 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cb786f76c-8xlwx" event={"ID":"e41bbb94-b986-4268-8040-77542216b905","Type":"ContainerStarted","Data":"db4983df9038a909cc6c9e8bd5cddb1e6d9ff4f8542855af2578e2773b6584f9"} Dec 04 10:34:09 crc kubenswrapper[4831]: E1204 10:34:09.641047 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.47:5001/podified-master-centos10/openstack-placement-api:watcher_latest\\\"\"" pod="openstack/placement-db-sync-9hwx9" podUID="ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.666001 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-config" (OuterVolumeSpecName: "config") pod "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" (UID: "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.668734 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.668818 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.668836 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.668848 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.668861 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.670867 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" (UID: "2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.686560 4831 scope.go:117] "RemoveContainer" containerID="c914bf4be613426ce9e2fd91cb2f06ec674896d4ae1a394ceb55b91ae1dd71f9" Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.686768 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cf8898794-dbfdf"] Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.699043 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-q6jc5"] Dec 04 10:34:09 crc kubenswrapper[4831]: W1204 10:34:09.725526 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1f8e9de_4491_4e25_bee0_457da0500046.slice/crio-85f464ec95c50797f11c4a7ad4438df833af4ef926fba0c8dae8e22de4d6a475 WatchSource:0}: Error finding container 85f464ec95c50797f11c4a7ad4438df833af4ef926fba0c8dae8e22de4d6a475: Status 404 returned error can't find the container with id 85f464ec95c50797f11c4a7ad4438df833af4ef926fba0c8dae8e22de4d6a475 Dec 04 10:34:09 crc kubenswrapper[4831]: I1204 10:34:09.771042 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.035577 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.178233 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-scripts\") pod \"33636a94-6988-4866-81d9-408053be5c58\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.178624 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33636a94-6988-4866-81d9-408053be5c58-logs\") pod \"33636a94-6988-4866-81d9-408053be5c58\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.178731 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-config-data\") pod \"33636a94-6988-4866-81d9-408053be5c58\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.178920 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wmt4\" (UniqueName: \"kubernetes.io/projected/33636a94-6988-4866-81d9-408053be5c58-kube-api-access-5wmt4\") pod \"33636a94-6988-4866-81d9-408053be5c58\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.179054 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/33636a94-6988-4866-81d9-408053be5c58-horizon-secret-key\") pod \"33636a94-6988-4866-81d9-408053be5c58\" (UID: \"33636a94-6988-4866-81d9-408053be5c58\") " Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.179204 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/33636a94-6988-4866-81d9-408053be5c58-logs" (OuterVolumeSpecName: "logs") pod "33636a94-6988-4866-81d9-408053be5c58" (UID: "33636a94-6988-4866-81d9-408053be5c58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.179466 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-scripts" (OuterVolumeSpecName: "scripts") pod "33636a94-6988-4866-81d9-408053be5c58" (UID: "33636a94-6988-4866-81d9-408053be5c58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.179646 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.179680 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33636a94-6988-4866-81d9-408053be5c58-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.180179 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-config-data" (OuterVolumeSpecName: "config-data") pod "33636a94-6988-4866-81d9-408053be5c58" (UID: "33636a94-6988-4866-81d9-408053be5c58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.191396 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33636a94-6988-4866-81d9-408053be5c58-kube-api-access-5wmt4" (OuterVolumeSpecName: "kube-api-access-5wmt4") pod "33636a94-6988-4866-81d9-408053be5c58" (UID: "33636a94-6988-4866-81d9-408053be5c58"). 
InnerVolumeSpecName "kube-api-access-5wmt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.192701 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33636a94-6988-4866-81d9-408053be5c58-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "33636a94-6988-4866-81d9-408053be5c58" (UID: "33636a94-6988-4866-81d9-408053be5c58"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.283048 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33636a94-6988-4866-81d9-408053be5c58-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.283084 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wmt4\" (UniqueName: \"kubernetes.io/projected/33636a94-6988-4866-81d9-408053be5c58-kube-api-access-5wmt4\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.283095 4831 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/33636a94-6988-4866-81d9-408053be5c58-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.405480 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59db87f787-ql2pd"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.417526 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59db87f787-ql2pd"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.450191 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vgwjn"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.465008 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-bootstrap-vgwjn"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.478616 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s9bgg"] Dec 04 10:34:10 crc kubenswrapper[4831]: E1204 10:34:10.478997 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a" containerName="watcher-db-sync" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.479017 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a" containerName="watcher-db-sync" Dec 04 10:34:10 crc kubenswrapper[4831]: E1204 10:34:10.479028 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40946a3-b49e-4cb0-b113-18711def8c0e" containerName="mariadb-account-create" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.479035 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40946a3-b49e-4cb0-b113-18711def8c0e" containerName="mariadb-account-create" Dec 04 10:34:10 crc kubenswrapper[4831]: E1204 10:34:10.479051 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb735a75-e006-4422-9ef1-f1ba13d56646" containerName="keystone-bootstrap" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.479057 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb735a75-e006-4422-9ef1-f1ba13d56646" containerName="keystone-bootstrap" Dec 04 10:34:10 crc kubenswrapper[4831]: E1204 10:34:10.479074 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" containerName="dnsmasq-dns" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.479080 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" containerName="dnsmasq-dns" Dec 04 10:34:10 crc kubenswrapper[4831]: E1204 10:34:10.479091 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" containerName="init" Dec 
04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.479097 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" containerName="init" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.479296 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40946a3-b49e-4cb0-b113-18711def8c0e" containerName="mariadb-account-create" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.479317 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a" containerName="watcher-db-sync" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.479326 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb735a75-e006-4422-9ef1-f1ba13d56646" containerName="keystone-bootstrap" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.479344 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" containerName="dnsmasq-dns" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.479916 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.481813 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.486058 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5hfxd" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.486113 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.490091 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.493396 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s9bgg"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.582753 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.597590 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.601245 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-config-data\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.601342 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-scripts\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.601426 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-combined-ca-bundle\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.601466 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-fernet-keys\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.601509 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h9kx\" (UniqueName: \"kubernetes.io/projected/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-kube-api-access-5h9kx\") pod \"keystone-bootstrap-s9bgg\" (UID: 
\"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.601590 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-credential-keys\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.608258 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-2vnjf" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.612916 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.677732 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.704435 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-scripts\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.704532 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-combined-ca-bundle\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.704565 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-fernet-keys\") 
pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.704593 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h89xx\" (UniqueName: \"kubernetes.io/projected/6040f79c-a151-4681-a758-d2741bff68b6-kube-api-access-h89xx\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.704626 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h9kx\" (UniqueName: \"kubernetes.io/projected/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-kube-api-access-5h9kx\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.704708 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-credential-keys\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.704731 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.704763 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.704788 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-config-data\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.704806 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.704848 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6040f79c-a151-4681-a758-d2741bff68b6-logs\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.758312 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-scripts\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.760621 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-fernet-keys\") pod \"keystone-bootstrap-s9bgg\" (UID: 
\"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.769205 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-credential-keys\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.769884 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-config-data\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.779623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h9kx\" (UniqueName: \"kubernetes.io/projected/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-kube-api-access-5h9kx\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.795255 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-combined-ca-bundle\") pod \"keystone-bootstrap-s9bgg\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.799323 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.806978 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.808985 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.809553 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.809596 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.809620 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.809651 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6040f79c-a151-4681-a758-d2741bff68b6-logs\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.809778 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h89xx\" (UniqueName: \"kubernetes.io/projected/6040f79c-a151-4681-a758-d2741bff68b6-kube-api-access-h89xx\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.810393 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6040f79c-a151-4681-a758-d2741bff68b6-logs\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.813684 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.817467 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.822358 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.826911 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-7svk9"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.828637 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.835742 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.846758 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.852923 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.853476 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tsc2h" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.857242 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cf8898794-dbfdf" event={"ID":"f1f8e9de-4491-4e25-bee0-457da0500046","Type":"ContainerStarted","Data":"85f464ec95c50797f11c4a7ad4438df833af4ef926fba0c8dae8e22de4d6a475"} Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.870427 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h89xx\" (UniqueName: \"kubernetes.io/projected/6040f79c-a151-4681-a758-d2741bff68b6-kube-api-access-h89xx\") pod \"watcher-decision-engine-0\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.876970 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-q6jc5" event={"ID":"41899055-8db6-4cdb-a9da-2bbb143b9f3f","Type":"ContainerStarted","Data":"e1f772ec096859627654955f2c6c91d0fe6fa9818d8f0b540d61604d41528c2a"} Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.878616 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7svk9"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.882614 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cb786f76c-8xlwx" event={"ID":"e41bbb94-b986-4268-8040-77542216b905","Type":"ContainerStarted","Data":"abeef5ea612be1debf78fcd95ef3690d3ede7c824d0834ae5bda84e1dd93c780"} Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.885903 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cb786f76c-8xlwx" podUID="e41bbb94-b986-4268-8040-77542216b905" containerName="horizon-log" containerID="cri-o://db4983df9038a909cc6c9e8bd5cddb1e6d9ff4f8542855af2578e2773b6584f9" gracePeriod=30 Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.886044 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cb786f76c-8xlwx" podUID="e41bbb94-b986-4268-8040-77542216b905" containerName="horizon" containerID="cri-o://abeef5ea612be1debf78fcd95ef3690d3ede7c824d0834ae5bda84e1dd93c780" gracePeriod=30 Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.892555 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.894703 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.895222 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fb66c999-tf56s" event={"ID":"33636a94-6988-4866-81d9-408053be5c58","Type":"ContainerDied","Data":"af0b1bbd541b7e37f8d499e43fc9a02f95cf897521af46664deef383c8f39c62"} Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.895303 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fb66c999-tf56s" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.902224 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.909260 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.919570 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.919896 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbdnw\" (UniqueName: \"kubernetes.io/projected/077ce354-b1d3-40d2-bf19-0eee3d474753-kube-api-access-jbdnw\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.920027 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q885\" (UniqueName: \"kubernetes.io/projected/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-kube-api-access-8q885\") pod \"barbican-db-sync-7svk9\" (UID: 
\"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.920883 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-db-sync-config-data\") pod \"barbican-db-sync-7svk9\" (UID: \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.921640 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077ce354-b1d3-40d2-bf19-0eee3d474753-logs\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.921743 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-config-data\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.921863 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.922071 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-combined-ca-bundle\") pod \"barbican-db-sync-7svk9\" (UID: \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " pod="openstack/barbican-db-sync-7svk9" 
Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.933187 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f77cc7d69-g6vcb" podUID="85393e48-dbc6-41c7-9419-4baf00f072db" containerName="horizon" containerID="cri-o://ced500055055bcb942e672ef9b1d0182950bde3c7bee1e55888858a7e706bad5" gracePeriod=30 Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.937723 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f77cc7d69-g6vcb" podUID="85393e48-dbc6-41c7-9419-4baf00f072db" containerName="horizon-log" containerID="cri-o://936c2a23b071f14add02e0c514fe8771dabd15308495a12484cd9404a1590f1d" gracePeriod=30 Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.942573 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6694d6d998-wgcht" event={"ID":"42db10e0-67a3-49d3-b6c5-8f48e31775e7","Type":"ContainerStarted","Data":"64dc7716d32e5689512dbff216be683df65b38feec19853322823f863328f562"} Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.960338 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.962821 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b769c363-a026-4bf4-9b56-3d1452b6847d","Type":"ContainerStarted","Data":"021d8e6d61418e963a5e960d7a12ba82d80b9c251032566a717d19caa4e03441"} Dec 04 10:34:10 crc kubenswrapper[4831]: I1204 10:34:10.991646 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-377a-account-create-c9vwp" podStartSLOduration=12.991628267 podStartE2EDuration="12.991628267s" podCreationTimestamp="2025-12-04 10:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:10.887119443 +0000 UTC m=+1147.836294767" watchObservedRunningTime="2025-12-04 10:34:10.991628267 +0000 UTC m=+1147.940803581" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024407 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077ce354-b1d3-40d2-bf19-0eee3d474753-logs\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024673 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-config-data\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024703 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 
04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024772 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d998428-237f-4a86-9be9-8e3f563003b9-config-data\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024791 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d998428-237f-4a86-9be9-8e3f563003b9-logs\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024814 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-combined-ca-bundle\") pod \"barbican-db-sync-7svk9\" (UID: \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024852 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh6fx\" (UniqueName: \"kubernetes.io/projected/8d998428-237f-4a86-9be9-8e3f563003b9-kube-api-access-vh6fx\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024870 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d998428-237f-4a86-9be9-8e3f563003b9-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024891 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024913 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbdnw\" (UniqueName: \"kubernetes.io/projected/077ce354-b1d3-40d2-bf19-0eee3d474753-kube-api-access-jbdnw\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024928 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q885\" (UniqueName: \"kubernetes.io/projected/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-kube-api-access-8q885\") pod \"barbican-db-sync-7svk9\" (UID: \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.024950 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-db-sync-config-data\") pod \"barbican-db-sync-7svk9\" (UID: \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.025843 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077ce354-b1d3-40d2-bf19-0eee3d474753-logs\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.035065 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-db-sync-config-data\") pod \"barbican-db-sync-7svk9\" (UID: \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.040361 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.041207 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-combined-ca-bundle\") pod \"barbican-db-sync-7svk9\" (UID: \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.043752 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.056021 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5cb786f76c-8xlwx" podStartSLOduration=4.2283392 podStartE2EDuration="20.056003862s" podCreationTimestamp="2025-12-04 10:33:51 +0000 UTC" firstStartedPulling="2025-12-04 10:33:53.145761985 +0000 UTC m=+1130.094937299" lastFinishedPulling="2025-12-04 10:34:08.973426647 +0000 UTC m=+1145.922601961" observedRunningTime="2025-12-04 10:34:10.90901061 +0000 UTC m=+1147.858185924" watchObservedRunningTime="2025-12-04 10:34:11.056003862 +0000 UTC m=+1148.005179176" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.056365 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-config-data\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.064277 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q885\" (UniqueName: \"kubernetes.io/projected/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-kube-api-access-8q885\") pod \"barbican-db-sync-7svk9\" (UID: \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.073258 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbdnw\" (UniqueName: \"kubernetes.io/projected/077ce354-b1d3-40d2-bf19-0eee3d474753-kube-api-access-jbdnw\") pod \"watcher-api-0\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " pod="openstack/watcher-api-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.105008 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6f77cc7d69-g6vcb" podStartSLOduration=3.073015194 podStartE2EDuration="16.104984916s" podCreationTimestamp="2025-12-04 10:33:55 +0000 UTC" firstStartedPulling="2025-12-04 10:33:56.012000446 +0000 UTC m=+1132.961175760" lastFinishedPulling="2025-12-04 10:34:09.043970168 +0000 UTC m=+1145.993145482" observedRunningTime="2025-12-04 10:34:10.975535009 +0000 UTC m=+1147.924710323" watchObservedRunningTime="2025-12-04 10:34:11.104984916 +0000 UTC m=+1148.054160230" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.130914 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh6fx\" (UniqueName: \"kubernetes.io/projected/8d998428-237f-4a86-9be9-8e3f563003b9-kube-api-access-vh6fx\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " 
pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.130960 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d998428-237f-4a86-9be9-8e3f563003b9-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.131153 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d998428-237f-4a86-9be9-8e3f563003b9-config-data\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.131174 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d998428-237f-4a86-9be9-8e3f563003b9-logs\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.132291 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d998428-237f-4a86-9be9-8e3f563003b9-logs\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.139765 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d998428-237f-4a86-9be9-8e3f563003b9-config-data\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.152530 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8d998428-237f-4a86-9be9-8e3f563003b9-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.162951 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh6fx\" (UniqueName: \"kubernetes.io/projected/8d998428-237f-4a86-9be9-8e3f563003b9-kube-api-access-vh6fx\") pod \"watcher-applier-0\" (UID: \"8d998428-237f-4a86-9be9-8e3f563003b9\") " pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.188181 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.227711 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.272452 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fb66c999-tf56s"] Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.332495 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61" path="/var/lib/kubelet/pods/2ee1b2fc-3b04-4f1c-9671-a3cb4945fd61/volumes" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.335165 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.343490 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb735a75-e006-4422-9ef1-f1ba13d56646" path="/var/lib/kubelet/pods/cb735a75-e006-4422-9ef1-f1ba13d56646/volumes" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.344115 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5fb66c999-tf56s"] Dec 04 10:34:11 crc kubenswrapper[4831]: E1204 10:34:11.429747 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33636a94_6988_4866_81d9_408053be5c58.slice\": RecentStats: unable to find data in memory cache]" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.592927 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s9bgg"] Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.720804 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-edf8-account-create-w79xp" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.862689 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm5v9\" (UniqueName: \"kubernetes.io/projected/664dafb4-41fc-4bfc-8355-32ae4ef51867-kube-api-access-wm5v9\") pod \"664dafb4-41fc-4bfc-8355-32ae4ef51867\" (UID: \"664dafb4-41fc-4bfc-8355-32ae4ef51867\") " Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.877288 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/664dafb4-41fc-4bfc-8355-32ae4ef51867-kube-api-access-wm5v9" (OuterVolumeSpecName: "kube-api-access-wm5v9") pod "664dafb4-41fc-4bfc-8355-32ae4ef51867" (UID: "664dafb4-41fc-4bfc-8355-32ae4ef51867"). InnerVolumeSpecName "kube-api-access-wm5v9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.967163 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 04 10:34:11 crc kubenswrapper[4831]: I1204 10:34:11.967320 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm5v9\" (UniqueName: \"kubernetes.io/projected/664dafb4-41fc-4bfc-8355-32ae4ef51867-kube-api-access-wm5v9\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.001206 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6040f79c-a151-4681-a758-d2741bff68b6","Type":"ContainerStarted","Data":"3ced7baec4e0db4fc5d0edc8669d31b19513384a9477f798f028a7a0cb3f7a39"} Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.005035 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f77cc7d69-g6vcb" event={"ID":"85393e48-dbc6-41c7-9419-4baf00f072db","Type":"ContainerStarted","Data":"ced500055055bcb942e672ef9b1d0182950bde3c7bee1e55888858a7e706bad5"} Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.012670 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s9bgg" event={"ID":"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5","Type":"ContainerStarted","Data":"ad385629ce59d58b186d0c360c1876ad8d7a01e96e773c4c52a5a84490d884ad"} Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.012707 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s9bgg" event={"ID":"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5","Type":"ContainerStarted","Data":"171b4e9c73dee7ff85131084990960ad9f9c22119a40a59b610b610ca3f5404f"} Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.028033 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-edf8-account-create-w79xp" 
event={"ID":"664dafb4-41fc-4bfc-8355-32ae4ef51867","Type":"ContainerDied","Data":"4822608cfc2181a01a791220dccb8ba9ac4a97eff3641f777ac7c4fb760c92cd"} Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.028078 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4822608cfc2181a01a791220dccb8ba9ac4a97eff3641f777ac7c4fb760c92cd" Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.028143 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-edf8-account-create-w79xp" Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.038741 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7svk9"] Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.039742 4831 generic.go:334] "Generic (PLEG): container finished" podID="f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f" containerID="e77834aaccfef7207ad6ec5de99f29061ec47a32f6211f8ba60f1ebd9a582255" exitCode=0 Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.039798 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-377a-account-create-c9vwp" event={"ID":"f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f","Type":"ContainerDied","Data":"e77834aaccfef7207ad6ec5de99f29061ec47a32f6211f8ba60f1ebd9a582255"} Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.045050 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cf8898794-dbfdf" event={"ID":"f1f8e9de-4491-4e25-bee0-457da0500046","Type":"ContainerStarted","Data":"8e283c75beb19fe7baa3ce5696a287b5602cff470b27cef31751d41f29b6e12f"} Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.045092 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cf8898794-dbfdf" event={"ID":"f1f8e9de-4491-4e25-bee0-457da0500046","Type":"ContainerStarted","Data":"5538b7a3acd8ec138542a0db142fd4d8a4efd13b727e505a41c6d36cf8268745"} Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.063487 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s9bgg" podStartSLOduration=2.0634485 podStartE2EDuration="2.0634485s" podCreationTimestamp="2025-12-04 10:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:12.030395601 +0000 UTC m=+1148.979570915" watchObservedRunningTime="2025-12-04 10:34:12.0634485 +0000 UTC m=+1149.012623814" Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.088233 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q6jc5" event={"ID":"41899055-8db6-4cdb-a9da-2bbb143b9f3f","Type":"ContainerStarted","Data":"3955d53461eddec4027820d5075fab1744fdfab5ddf7f0c3d04e497e833aa4b9"} Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.093482 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5cf8898794-dbfdf" podStartSLOduration=11.093463573 podStartE2EDuration="11.093463573s" podCreationTimestamp="2025-12-04 10:34:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:12.085312576 +0000 UTC m=+1149.034487890" watchObservedRunningTime="2025-12-04 10:34:12.093463573 +0000 UTC m=+1149.042638887" Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.098776 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6694d6d998-wgcht" event={"ID":"42db10e0-67a3-49d3-b6c5-8f48e31775e7","Type":"ContainerStarted","Data":"7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a"} Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.098871 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6694d6d998-wgcht" event={"ID":"42db10e0-67a3-49d3-b6c5-8f48e31775e7","Type":"ContainerStarted","Data":"5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a"} Dec 04 
10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.135037 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-q6jc5" podStartSLOduration=14.135019448 podStartE2EDuration="14.135019448s" podCreationTimestamp="2025-12-04 10:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:12.11184588 +0000 UTC m=+1149.061021194" watchObservedRunningTime="2025-12-04 10:34:12.135019448 +0000 UTC m=+1149.084194762" Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.151082 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.163629 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6694d6d998-wgcht" podStartSLOduration=11.163610384 podStartE2EDuration="11.163610384s" podCreationTimestamp="2025-12-04 10:34:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:12.134813183 +0000 UTC m=+1149.083988497" watchObservedRunningTime="2025-12-04 10:34:12.163610384 +0000 UTC m=+1149.112785698" Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.176477 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 04 10:34:12 crc kubenswrapper[4831]: I1204 10:34:12.431990 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:34:13 crc kubenswrapper[4831]: I1204 10:34:13.107277 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7svk9" event={"ID":"a79b812b-5862-43d0-a6d5-5c6ec3a63e51","Type":"ContainerStarted","Data":"4cd2e5ce6cdbfe9fb9495743f410cbeb852ed0fd24d3fa070837a71293dbc911"} Dec 04 10:34:13 crc kubenswrapper[4831]: I1204 10:34:13.108511 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"077ce354-b1d3-40d2-bf19-0eee3d474753","Type":"ContainerStarted","Data":"13382f8529ff796d8a356c27250577db1aff612b0eec44dd5ec22a8084db6f59"} Dec 04 10:34:13 crc kubenswrapper[4831]: I1204 10:34:13.108550 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"077ce354-b1d3-40d2-bf19-0eee3d474753","Type":"ContainerStarted","Data":"42c50d79fe4d00ec562856ffcd0393ff382e2a8a7f73e1b903f651da3536aea2"} Dec 04 10:34:13 crc kubenswrapper[4831]: I1204 10:34:13.109947 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8d998428-237f-4a86-9be9-8e3f563003b9","Type":"ContainerStarted","Data":"34eeca10869a0010aeeb267383a0a3c1913bbf09cc141e8343ab58c4b84b3165"} Dec 04 10:34:13 crc kubenswrapper[4831]: I1204 10:34:13.313025 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33636a94-6988-4866-81d9-408053be5c58" path="/var/lib/kubelet/pods/33636a94-6988-4866-81d9-408053be5c58/volumes" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.429133 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.651243 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-dmcxs"] Dec 04 10:34:15 crc kubenswrapper[4831]: E1204 10:34:15.651606 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="664dafb4-41fc-4bfc-8355-32ae4ef51867" containerName="mariadb-account-create" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.651622 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="664dafb4-41fc-4bfc-8355-32ae4ef51867" containerName="mariadb-account-create" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.651814 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="664dafb4-41fc-4bfc-8355-32ae4ef51867" 
containerName="mariadb-account-create" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.652406 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.654548 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5f2ml" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.654897 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.655051 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.672990 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dmcxs"] Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.740337 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-db-sync-config-data\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.740407 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxr78\" (UniqueName: \"kubernetes.io/projected/94129f00-4043-4552-9724-feef1585cd20-kube-api-access-pxr78\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.740430 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94129f00-4043-4552-9724-feef1585cd20-etc-machine-id\") pod \"cinder-db-sync-dmcxs\" (UID: 
\"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.740506 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-scripts\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.740535 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-config-data\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.740589 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-combined-ca-bundle\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.841981 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-scripts\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.842044 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-config-data\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: 
I1204 10:34:15.842988 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-combined-ca-bundle\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.843045 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-db-sync-config-data\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.843095 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxr78\" (UniqueName: \"kubernetes.io/projected/94129f00-4043-4552-9724-feef1585cd20-kube-api-access-pxr78\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.843123 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94129f00-4043-4552-9724-feef1585cd20-etc-machine-id\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.843235 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94129f00-4043-4552-9724-feef1585cd20-etc-machine-id\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.850912 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-scripts\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.851010 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-combined-ca-bundle\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.851057 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-db-sync-config-data\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.851337 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-config-data\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.861229 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxr78\" (UniqueName: \"kubernetes.io/projected/94129f00-4043-4552-9724-feef1585cd20-kube-api-access-pxr78\") pod \"cinder-db-sync-dmcxs\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:15 crc kubenswrapper[4831]: I1204 10:34:15.971347 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:34:17 crc kubenswrapper[4831]: I1204 10:34:17.170633 4831 generic.go:334] "Generic (PLEG): container finished" podID="ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5" containerID="ad385629ce59d58b186d0c360c1876ad8d7a01e96e773c4c52a5a84490d884ad" exitCode=0 Dec 04 10:34:17 crc kubenswrapper[4831]: I1204 10:34:17.170712 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s9bgg" event={"ID":"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5","Type":"ContainerDied","Data":"ad385629ce59d58b186d0c360c1876ad8d7a01e96e773c4c52a5a84490d884ad"} Dec 04 10:34:17 crc kubenswrapper[4831]: I1204 10:34:17.840550 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-377a-account-create-c9vwp" Dec 04 10:34:17 crc kubenswrapper[4831]: I1204 10:34:17.979412 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxwlq\" (UniqueName: \"kubernetes.io/projected/f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f-kube-api-access-jxwlq\") pod \"f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f\" (UID: \"f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f\") " Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.002935 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f-kube-api-access-jxwlq" (OuterVolumeSpecName: "kube-api-access-jxwlq") pod "f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f" (UID: "f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f"). InnerVolumeSpecName "kube-api-access-jxwlq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.082417 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxwlq\" (UniqueName: \"kubernetes.io/projected/f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f-kube-api-access-jxwlq\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.197897 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-377a-account-create-c9vwp" event={"ID":"f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f","Type":"ContainerDied","Data":"a4f59bb0c9730b008ecd4a186846008d8d2e7365d725abc1653c25add7b9fcde"} Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.197937 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4f59bb0c9730b008ecd4a186846008d8d2e7365d725abc1653c25add7b9fcde" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.197986 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-377a-account-create-c9vwp" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.203446 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"077ce354-b1d3-40d2-bf19-0eee3d474753","Type":"ContainerStarted","Data":"4d14cc312ce3fdd5254c5eeef2f36ed49f38e1d7d481bf48b33eb6616558c934"} Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.204133 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.209185 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8d998428-237f-4a86-9be9-8e3f563003b9","Type":"ContainerStarted","Data":"3c1b562caa701fdd6bd58b8bfe4f7bf3cfc9fef5539cce4c653bda4ee794a344"} Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.230819 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" 
podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.157:9322/\": dial tcp 10.217.0.157:9322: connect: connection refused" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.242636 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b769c363-a026-4bf4-9b56-3d1452b6847d","Type":"ContainerStarted","Data":"7c81ef06f0d2ef4ac48cdbdbbce8ccf5febd339c9121f55b37b94aff84afb322"} Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.252884 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7svk9" event={"ID":"a79b812b-5862-43d0-a6d5-5c6ec3a63e51","Type":"ContainerStarted","Data":"f926e57e3fd75cc46053dfb6e24b1843bd59fb8625828329f64b893c77a761da"} Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.270572 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=8.270549886 podStartE2EDuration="8.270549886s" podCreationTimestamp="2025-12-04 10:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:18.240675108 +0000 UTC m=+1155.189850442" watchObservedRunningTime="2025-12-04 10:34:18.270549886 +0000 UTC m=+1155.219725190" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.302297 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.610693609 podStartE2EDuration="8.302276302s" podCreationTimestamp="2025-12-04 10:34:10 +0000 UTC" firstStartedPulling="2025-12-04 10:34:12.176276726 +0000 UTC m=+1149.125452040" lastFinishedPulling="2025-12-04 10:34:17.867859419 +0000 UTC m=+1154.817034733" observedRunningTime="2025-12-04 10:34:18.276018745 +0000 UTC m=+1155.225194079" watchObservedRunningTime="2025-12-04 10:34:18.302276302 +0000 UTC m=+1155.251451616" Dec 04 10:34:18 crc 
kubenswrapper[4831]: I1204 10:34:18.309630 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-7svk9" podStartSLOduration=2.525535696 podStartE2EDuration="8.309609038s" podCreationTimestamp="2025-12-04 10:34:10 +0000 UTC" firstStartedPulling="2025-12-04 10:34:12.064023505 +0000 UTC m=+1149.013198819" lastFinishedPulling="2025-12-04 10:34:17.848096847 +0000 UTC m=+1154.797272161" observedRunningTime="2025-12-04 10:34:18.29630155 +0000 UTC m=+1155.245476884" watchObservedRunningTime="2025-12-04 10:34:18.309609038 +0000 UTC m=+1155.258784352" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.346990 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.4472454470000002 podStartE2EDuration="8.346759802s" podCreationTimestamp="2025-12-04 10:34:10 +0000 UTC" firstStartedPulling="2025-12-04 10:34:11.968639482 +0000 UTC m=+1148.917814796" lastFinishedPulling="2025-12-04 10:34:17.868153837 +0000 UTC m=+1154.817329151" observedRunningTime="2025-12-04 10:34:18.327307768 +0000 UTC m=+1155.276483082" watchObservedRunningTime="2025-12-04 10:34:18.346759802 +0000 UTC m=+1155.295935116" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.398004 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dmcxs"] Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.570548 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.707162 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-combined-ca-bundle\") pod \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.707282 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-fernet-keys\") pod \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.707335 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-config-data\") pod \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.707377 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-scripts\") pod \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.707396 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-credential-keys\") pod \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.708138 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h9kx\" (UniqueName: 
\"kubernetes.io/projected/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-kube-api-access-5h9kx\") pod \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\" (UID: \"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5\") " Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.724412 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5" (UID: "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.724784 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-scripts" (OuterVolumeSpecName: "scripts") pod "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5" (UID: "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.727375 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-kube-api-access-5h9kx" (OuterVolumeSpecName: "kube-api-access-5h9kx") pod "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5" (UID: "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5"). InnerVolumeSpecName "kube-api-access-5h9kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.730747 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5" (UID: "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.743644 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5" (UID: "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.755519 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-config-data" (OuterVolumeSpecName: "config-data") pod "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5" (UID: "ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.810166 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.810221 4831 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.810239 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.810256 4831 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 
10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.810275 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:18 crc kubenswrapper[4831]: I1204 10:34:18.810294 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h9kx\" (UniqueName: \"kubernetes.io/projected/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5-kube-api-access-5h9kx\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.266820 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6f799f5dc-hv6fj"] Dec 04 10:34:19 crc kubenswrapper[4831]: E1204 10:34:19.267186 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5" containerName="keystone-bootstrap" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.267204 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5" containerName="keystone-bootstrap" Dec 04 10:34:19 crc kubenswrapper[4831]: E1204 10:34:19.267228 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f" containerName="mariadb-account-create" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.267236 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f" containerName="mariadb-account-create" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.267392 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f" containerName="mariadb-account-create" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.267422 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5" containerName="keystone-bootstrap" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.268009 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.271048 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.271217 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.277430 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6040f79c-a151-4681-a758-d2741bff68b6","Type":"ContainerStarted","Data":"14cf7ccbd3c01321069a858e40065913f80313d32694b050774b055ef233e105"} Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.283600 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s9bgg" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.335816 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f799f5dc-hv6fj"] Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.335855 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s9bgg" event={"ID":"ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5","Type":"ContainerDied","Data":"171b4e9c73dee7ff85131084990960ad9f9c22119a40a59b610b610ca3f5404f"} Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.335875 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="171b4e9c73dee7ff85131084990960ad9f9c22119a40a59b610b610ca3f5404f" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.335884 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dmcxs" event={"ID":"94129f00-4043-4552-9724-feef1585cd20","Type":"ContainerStarted","Data":"12e2ef018400b0e25abb0efd00ab7137f3ef68f1bf719dc0723c4361151114ed"} Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.427389 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-public-tls-certs\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.427520 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-internal-tls-certs\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.427567 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9h5h\" (UniqueName: \"kubernetes.io/projected/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-kube-api-access-v9h5h\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.427615 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-fernet-keys\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.427650 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-config-data\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.427735 
4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-credential-keys\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.427809 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-combined-ca-bundle\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.427893 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-scripts\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.529723 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-config-data\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.529794 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-credential-keys\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.529840 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-combined-ca-bundle\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.529885 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-scripts\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.529987 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-public-tls-certs\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.530745 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-internal-tls-certs\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.530790 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9h5h\" (UniqueName: \"kubernetes.io/projected/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-kube-api-access-v9h5h\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.530819 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-fernet-keys\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.536620 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-credential-keys\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.537217 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-internal-tls-certs\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.537646 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-public-tls-certs\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.538209 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-combined-ca-bundle\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.539407 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-config-data\") pod \"keystone-6f799f5dc-hv6fj\" (UID: 
\"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.541154 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-scripts\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.554328 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9h5h\" (UniqueName: \"kubernetes.io/projected/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-kube-api-access-v9h5h\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.575729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d4c2366-4b32-485c-88d3-2e6ff2d19bc8-fernet-keys\") pod \"keystone-6f799f5dc-hv6fj\" (UID: \"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8\") " pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:19 crc kubenswrapper[4831]: I1204 10:34:19.611776 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:20 crc kubenswrapper[4831]: I1204 10:34:20.193573 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f799f5dc-hv6fj"] Dec 04 10:34:20 crc kubenswrapper[4831]: W1204 10:34:20.199928 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d4c2366_4b32_485c_88d3_2e6ff2d19bc8.slice/crio-c9dd85d98763b591a27af24496cd5694dd054bec1d10039284d0ba808a2a83a8 WatchSource:0}: Error finding container c9dd85d98763b591a27af24496cd5694dd054bec1d10039284d0ba808a2a83a8: Status 404 returned error can't find the container with id c9dd85d98763b591a27af24496cd5694dd054bec1d10039284d0ba808a2a83a8 Dec 04 10:34:20 crc kubenswrapper[4831]: I1204 10:34:20.361840 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f799f5dc-hv6fj" event={"ID":"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8","Type":"ContainerStarted","Data":"c9dd85d98763b591a27af24496cd5694dd054bec1d10039284d0ba808a2a83a8"} Dec 04 10:34:20 crc kubenswrapper[4831]: I1204 10:34:20.961541 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.009009 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.228343 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.228716 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.292610 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 
10:34:21.340485 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.340526 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.380371 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f799f5dc-hv6fj" event={"ID":"3d4c2366-4b32-485c-88d3-2e6ff2d19bc8","Type":"ContainerStarted","Data":"00f2640dd76407c812ae9e3c53f3bfd88805d400a32fd9a12e3a3316004b31cc"} Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.381213 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.381864 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.417052 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6f799f5dc-hv6fj" podStartSLOduration=2.417033056 podStartE2EDuration="2.417033056s" podCreationTimestamp="2025-12-04 10:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:21.398652869 +0000 UTC m=+1158.347828193" watchObservedRunningTime="2025-12-04 10:34:21.417033056 +0000 UTC m=+1158.366208370" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.454243 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.465949 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.555407 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.555444 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.555833 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6694d6d998-wgcht" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.709763 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.709831 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:21 crc kubenswrapper[4831]: I1204 10:34:21.710235 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5cf8898794-dbfdf" podUID="f1f8e9de-4491-4e25-bee0-457da0500046" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Dec 04 10:34:22 crc kubenswrapper[4831]: I1204 10:34:22.046627 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 04 10:34:22 crc kubenswrapper[4831]: I1204 10:34:22.383274 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.115592 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-db-sync-g5bvk"] Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.117007 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.119196 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kvqnq" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.120727 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.152688 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-g5bvk"] Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.211216 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpv48\" (UniqueName: \"kubernetes.io/projected/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-kube-api-access-rpv48\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.211315 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-config-data\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.211360 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-db-sync-config-data\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.211385 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-combined-ca-bundle\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.313335 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpv48\" (UniqueName: \"kubernetes.io/projected/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-kube-api-access-rpv48\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.313422 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-config-data\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.313477 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-db-sync-config-data\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.313506 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-combined-ca-bundle\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.318743 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-combined-ca-bundle\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.318968 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-config-data\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.333052 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-db-sync-config-data\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.333236 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpv48\" (UniqueName: \"kubernetes.io/projected/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-kube-api-access-rpv48\") pod \"glance-db-sync-g5bvk\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.410184 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9hwx9" event={"ID":"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e","Type":"ContainerStarted","Data":"f4613a44c044b4534a4ec68c77b6c0ccafde0d0eef4915d6068731e902b02e5f"} Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.434609 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-g5bvk" Dec 04 10:34:23 crc kubenswrapper[4831]: I1204 10:34:23.442105 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-9hwx9" podStartSLOduration=3.33890691 podStartE2EDuration="32.44208074s" podCreationTimestamp="2025-12-04 10:33:51 +0000 UTC" firstStartedPulling="2025-12-04 10:33:53.267002084 +0000 UTC m=+1130.216177398" lastFinishedPulling="2025-12-04 10:34:22.370175914 +0000 UTC m=+1159.319351228" observedRunningTime="2025-12-04 10:34:23.431309827 +0000 UTC m=+1160.380485141" watchObservedRunningTime="2025-12-04 10:34:23.44208074 +0000 UTC m=+1160.391256054" Dec 04 10:34:24 crc kubenswrapper[4831]: I1204 10:34:24.260223 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-g5bvk"] Dec 04 10:34:26 crc kubenswrapper[4831]: I1204 10:34:26.462728 4831 generic.go:334] "Generic (PLEG): container finished" podID="6040f79c-a151-4681-a758-d2741bff68b6" containerID="14cf7ccbd3c01321069a858e40065913f80313d32694b050774b055ef233e105" exitCode=1 Dec 04 10:34:26 crc kubenswrapper[4831]: I1204 10:34:26.462809 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6040f79c-a151-4681-a758-d2741bff68b6","Type":"ContainerDied","Data":"14cf7ccbd3c01321069a858e40065913f80313d32694b050774b055ef233e105"} Dec 04 10:34:26 crc kubenswrapper[4831]: I1204 10:34:26.463697 4831 scope.go:117] "RemoveContainer" containerID="14cf7ccbd3c01321069a858e40065913f80313d32694b050774b055ef233e105" Dec 04 10:34:26 crc kubenswrapper[4831]: W1204 10:34:26.824931 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18f48aa7_65d1_41ce_bc0d_4973db8b7abe.slice/crio-2083b444af07eabf8c87ec1c8bad3d26b613524ebda6f40e4e88f35252cc9b96 WatchSource:0}: Error finding container 2083b444af07eabf8c87ec1c8bad3d26b613524ebda6f40e4e88f35252cc9b96: 
Status 404 returned error can't find the container with id 2083b444af07eabf8c87ec1c8bad3d26b613524ebda6f40e4e88f35252cc9b96 Dec 04 10:34:27 crc kubenswrapper[4831]: I1204 10:34:27.517531 4831 generic.go:334] "Generic (PLEG): container finished" podID="ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" containerID="f4613a44c044b4534a4ec68c77b6c0ccafde0d0eef4915d6068731e902b02e5f" exitCode=0 Dec 04 10:34:27 crc kubenswrapper[4831]: I1204 10:34:27.517627 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9hwx9" event={"ID":"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e","Type":"ContainerDied","Data":"f4613a44c044b4534a4ec68c77b6c0ccafde0d0eef4915d6068731e902b02e5f"} Dec 04 10:34:27 crc kubenswrapper[4831]: I1204 10:34:27.521040 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b769c363-a026-4bf4-9b56-3d1452b6847d","Type":"ContainerStarted","Data":"8f26f07839607b082f46d9f85dba97955a75c334c66d497c223ed587ce5b9dd4"} Dec 04 10:34:27 crc kubenswrapper[4831]: I1204 10:34:27.528289 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6040f79c-a151-4681-a758-d2741bff68b6","Type":"ContainerStarted","Data":"7751384cd6e87417fdd57ce1a33bf4f4b451bc1ea43414d0e681fdb437c59e89"} Dec 04 10:34:27 crc kubenswrapper[4831]: I1204 10:34:27.529975 4831 generic.go:334] "Generic (PLEG): container finished" podID="a79b812b-5862-43d0-a6d5-5c6ec3a63e51" containerID="f926e57e3fd75cc46053dfb6e24b1843bd59fb8625828329f64b893c77a761da" exitCode=0 Dec 04 10:34:27 crc kubenswrapper[4831]: I1204 10:34:27.530049 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7svk9" event={"ID":"a79b812b-5862-43d0-a6d5-5c6ec3a63e51","Type":"ContainerDied","Data":"f926e57e3fd75cc46053dfb6e24b1843bd59fb8625828329f64b893c77a761da"} Dec 04 10:34:27 crc kubenswrapper[4831]: I1204 10:34:27.532278 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-g5bvk" event={"ID":"18f48aa7-65d1-41ce-bc0d-4973db8b7abe","Type":"ContainerStarted","Data":"2083b444af07eabf8c87ec1c8bad3d26b613524ebda6f40e4e88f35252cc9b96"} Dec 04 10:34:30 crc kubenswrapper[4831]: I1204 10:34:30.561680 4831 generic.go:334] "Generic (PLEG): container finished" podID="6040f79c-a151-4681-a758-d2741bff68b6" containerID="7751384cd6e87417fdd57ce1a33bf4f4b451bc1ea43414d0e681fdb437c59e89" exitCode=1 Dec 04 10:34:30 crc kubenswrapper[4831]: I1204 10:34:30.561803 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6040f79c-a151-4681-a758-d2741bff68b6","Type":"ContainerDied","Data":"7751384cd6e87417fdd57ce1a33bf4f4b451bc1ea43414d0e681fdb437c59e89"} Dec 04 10:34:30 crc kubenswrapper[4831]: I1204 10:34:30.562073 4831 scope.go:117] "RemoveContainer" containerID="14cf7ccbd3c01321069a858e40065913f80313d32694b050774b055ef233e105" Dec 04 10:34:30 crc kubenswrapper[4831]: I1204 10:34:30.562756 4831 scope.go:117] "RemoveContainer" containerID="7751384cd6e87417fdd57ce1a33bf4f4b451bc1ea43414d0e681fdb437c59e89" Dec 04 10:34:30 crc kubenswrapper[4831]: E1204 10:34:30.563138 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6040f79c-a151-4681-a758-d2741bff68b6)\"" pod="openstack/watcher-decision-engine-0" podUID="6040f79c-a151-4681-a758-d2741bff68b6" Dec 04 10:34:30 crc kubenswrapper[4831]: I1204 10:34:30.961860 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 10:34:30 crc kubenswrapper[4831]: I1204 10:34:30.962245 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 10:34:31 crc kubenswrapper[4831]: I1204 10:34:31.347197 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Dec 04 10:34:31 crc kubenswrapper[4831]: I1204 10:34:31.359106 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 04 10:34:31 crc kubenswrapper[4831]: I1204 10:34:31.575051 4831 scope.go:117] "RemoveContainer" containerID="7751384cd6e87417fdd57ce1a33bf4f4b451bc1ea43414d0e681fdb437c59e89" Dec 04 10:34:31 crc kubenswrapper[4831]: E1204 10:34:31.575283 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6040f79c-a151-4681-a758-d2741bff68b6)\"" pod="openstack/watcher-decision-engine-0" podUID="6040f79c-a151-4681-a758-d2741bff68b6" Dec 04 10:34:33 crc kubenswrapper[4831]: I1204 10:34:33.638019 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:33 crc kubenswrapper[4831]: I1204 10:34:33.966347 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:34 crc kubenswrapper[4831]: I1204 10:34:34.037095 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 04 10:34:34 crc kubenswrapper[4831]: I1204 10:34:34.037380 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api-log" containerID="cri-o://13382f8529ff796d8a356c27250577db1aff612b0eec44dd5ec22a8084db6f59" gracePeriod=30 Dec 04 10:34:34 crc kubenswrapper[4831]: I1204 10:34:34.037509 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api" 
containerID="cri-o://4d14cc312ce3fdd5254c5eeef2f36ed49f38e1d7d481bf48b33eb6616558c934" gracePeriod=30 Dec 04 10:34:34 crc kubenswrapper[4831]: I1204 10:34:34.619296 4831 generic.go:334] "Generic (PLEG): container finished" podID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerID="13382f8529ff796d8a356c27250577db1aff612b0eec44dd5ec22a8084db6f59" exitCode=143 Dec 04 10:34:34 crc kubenswrapper[4831]: I1204 10:34:34.619655 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"077ce354-b1d3-40d2-bf19-0eee3d474753","Type":"ContainerDied","Data":"13382f8529ff796d8a356c27250577db1aff612b0eec44dd5ec22a8084db6f59"} Dec 04 10:34:35 crc kubenswrapper[4831]: I1204 10:34:35.360526 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:34:35 crc kubenswrapper[4831]: I1204 10:34:35.634300 4831 generic.go:334] "Generic (PLEG): container finished" podID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerID="4d14cc312ce3fdd5254c5eeef2f36ed49f38e1d7d481bf48b33eb6616558c934" exitCode=0 Dec 04 10:34:35 crc kubenswrapper[4831]: I1204 10:34:35.634349 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"077ce354-b1d3-40d2-bf19-0eee3d474753","Type":"ContainerDied","Data":"4d14cc312ce3fdd5254c5eeef2f36ed49f38e1d7d481bf48b33eb6616558c934"} Dec 04 10:34:35 crc kubenswrapper[4831]: I1204 10:34:35.774111 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5cf8898794-dbfdf" Dec 04 10:34:35 crc kubenswrapper[4831]: I1204 10:34:35.834064 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6694d6d998-wgcht"] Dec 04 10:34:35 crc kubenswrapper[4831]: I1204 10:34:35.834337 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6694d6d998-wgcht" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon-log" 
containerID="cri-o://7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a" gracePeriod=30 Dec 04 10:34:35 crc kubenswrapper[4831]: I1204 10:34:35.834422 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6694d6d998-wgcht" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon" containerID="cri-o://5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a" gracePeriod=30 Dec 04 10:34:36 crc kubenswrapper[4831]: I1204 10:34:36.340964 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9322/\": dial tcp 10.217.0.157:9322: connect: connection refused" Dec 04 10:34:36 crc kubenswrapper[4831]: I1204 10:34:36.341035 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.157:9322/\": dial tcp 10.217.0.157:9322: connect: connection refused" Dec 04 10:34:37 crc kubenswrapper[4831]: I1204 10:34:37.654771 4831 generic.go:334] "Generic (PLEG): container finished" podID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerID="5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a" exitCode=0 Dec 04 10:34:37 crc kubenswrapper[4831]: I1204 10:34:37.654820 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6694d6d998-wgcht" event={"ID":"42db10e0-67a3-49d3-b6c5-8f48e31775e7","Type":"ContainerDied","Data":"5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a"} Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.242977 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.311348 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-combined-ca-bundle\") pod \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\" (UID: \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.311413 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-db-sync-config-data\") pod \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\" (UID: \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.311635 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q885\" (UniqueName: \"kubernetes.io/projected/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-kube-api-access-8q885\") pod \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\" (UID: \"a79b812b-5862-43d0-a6d5-5c6ec3a63e51\") " Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.318810 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-kube-api-access-8q885" (OuterVolumeSpecName: "kube-api-access-8q885") pod "a79b812b-5862-43d0-a6d5-5c6ec3a63e51" (UID: "a79b812b-5862-43d0-a6d5-5c6ec3a63e51"). InnerVolumeSpecName "kube-api-access-8q885". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.321879 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a79b812b-5862-43d0-a6d5-5c6ec3a63e51" (UID: "a79b812b-5862-43d0-a6d5-5c6ec3a63e51"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.346861 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a79b812b-5862-43d0-a6d5-5c6ec3a63e51" (UID: "a79b812b-5862-43d0-a6d5-5c6ec3a63e51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.413656 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q885\" (UniqueName: \"kubernetes.io/projected/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-kube-api-access-8q885\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.413725 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.413743 4831 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79b812b-5862-43d0-a6d5-5c6ec3a63e51-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.666969 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7svk9" event={"ID":"a79b812b-5862-43d0-a6d5-5c6ec3a63e51","Type":"ContainerDied","Data":"4cd2e5ce6cdbfe9fb9495743f410cbeb852ed0fd24d3fa070837a71293dbc911"} Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.667005 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd2e5ce6cdbfe9fb9495743f410cbeb852ed0fd24d3fa070837a71293dbc911" Dec 04 10:34:38 crc kubenswrapper[4831]: I1204 10:34:38.667055 4831 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7svk9" Dec 04 10:34:39 crc kubenswrapper[4831]: E1204 10:34:39.324798 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Dec 04 10:34:39 crc kubenswrapper[4831]: E1204 10:34:39.324864 4831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Dec 04 10:34:39 crc kubenswrapper[4831]: E1204 10:34:39.325002 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.47:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly
:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pxr78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-dmcxs_openstack(94129f00-4043-4552-9724-feef1585cd20): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:34:39 crc kubenswrapper[4831]: E1204 10:34:39.326719 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-dmcxs" podUID="94129f00-4043-4552-9724-feef1585cd20" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.553395 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-69bc569cc4-dg5gj"] Dec 04 10:34:39 crc kubenswrapper[4831]: E1204 10:34:39.553804 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a79b812b-5862-43d0-a6d5-5c6ec3a63e51" containerName="barbican-db-sync" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.553827 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79b812b-5862-43d0-a6d5-5c6ec3a63e51" containerName="barbican-db-sync" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.554047 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79b812b-5862-43d0-a6d5-5c6ec3a63e51" containerName="barbican-db-sync" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.554990 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.558795 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.558953 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tsc2h" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.559062 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.561228 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6896bbf7f5-dvgps"] Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.562616 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.566105 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.626503 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6896bbf7f5-dvgps"] Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.641781 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/734f14e8-f267-4c7c-a5eb-d76457ec9d69-logs\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.641866 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734f14e8-f267-4c7c-a5eb-d76457ec9d69-config-data\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.641965 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734f14e8-f267-4c7c-a5eb-d76457ec9d69-combined-ca-bundle\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.642063 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3dabff-2635-4e29-9651-8df5d84838f9-combined-ca-bundle\") pod 
\"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.642241 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8b8w\" (UniqueName: \"kubernetes.io/projected/8c3dabff-2635-4e29-9651-8df5d84838f9-kube-api-access-n8b8w\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.642363 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c3dabff-2635-4e29-9651-8df5d84838f9-logs\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.642488 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3dabff-2635-4e29-9651-8df5d84838f9-config-data\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.642555 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c3dabff-2635-4e29-9651-8df5d84838f9-config-data-custom\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.642602 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgdxv\" (UniqueName: 
\"kubernetes.io/projected/734f14e8-f267-4c7c-a5eb-d76457ec9d69-kube-api-access-lgdxv\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.667103 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69bc569cc4-dg5gj"] Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.679789 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/734f14e8-f267-4c7c-a5eb-d76457ec9d69-config-data-custom\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: E1204 10:34:39.730302 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.47:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-dmcxs" podUID="94129f00-4043-4552-9724-feef1585cd20" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.780207 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9767fcff-cwqsr"] Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.781540 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3dabff-2635-4e29-9651-8df5d84838f9-combined-ca-bundle\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.781617 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n8b8w\" (UniqueName: \"kubernetes.io/projected/8c3dabff-2635-4e29-9651-8df5d84838f9-kube-api-access-n8b8w\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.781651 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c3dabff-2635-4e29-9651-8df5d84838f9-logs\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.781765 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3dabff-2635-4e29-9651-8df5d84838f9-config-data\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.781822 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c3dabff-2635-4e29-9651-8df5d84838f9-config-data-custom\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.781849 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgdxv\" (UniqueName: \"kubernetes.io/projected/734f14e8-f267-4c7c-a5eb-d76457ec9d69-kube-api-access-lgdxv\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.781868 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/734f14e8-f267-4c7c-a5eb-d76457ec9d69-config-data-custom\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.781939 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/734f14e8-f267-4c7c-a5eb-d76457ec9d69-logs\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.781962 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734f14e8-f267-4c7c-a5eb-d76457ec9d69-config-data\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.782003 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734f14e8-f267-4c7c-a5eb-d76457ec9d69-combined-ca-bundle\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.782027 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.785122 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c3dabff-2635-4e29-9651-8df5d84838f9-logs\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.786150 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/734f14e8-f267-4c7c-a5eb-d76457ec9d69-logs\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.792974 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3dabff-2635-4e29-9651-8df5d84838f9-combined-ca-bundle\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.795884 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9767fcff-cwqsr"] Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.806130 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3dabff-2635-4e29-9651-8df5d84838f9-config-data\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.806195 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-545d99f8dd-55gfz"] Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.808491 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.816796 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734f14e8-f267-4c7c-a5eb-d76457ec9d69-combined-ca-bundle\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.817729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c3dabff-2635-4e29-9651-8df5d84838f9-config-data-custom\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.818190 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8b8w\" (UniqueName: \"kubernetes.io/projected/8c3dabff-2635-4e29-9651-8df5d84838f9-kube-api-access-n8b8w\") pod \"barbican-worker-6896bbf7f5-dvgps\" (UID: \"8c3dabff-2635-4e29-9651-8df5d84838f9\") " pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.819581 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.819823 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734f14e8-f267-4c7c-a5eb-d76457ec9d69-config-data\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.820440 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/734f14e8-f267-4c7c-a5eb-d76457ec9d69-config-data-custom\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.831462 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgdxv\" (UniqueName: \"kubernetes.io/projected/734f14e8-f267-4c7c-a5eb-d76457ec9d69-kube-api-access-lgdxv\") pod \"barbican-keystone-listener-69bc569cc4-dg5gj\" (UID: \"734f14e8-f267-4c7c-a5eb-d76457ec9d69\") " pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.843995 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-545d99f8dd-55gfz"] Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.888260 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.888390 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-combined-ca-bundle\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.888469 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-svc\") pod 
\"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.888495 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be37a16a-5f7e-4f97-b71d-ee344177919c-logs\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.888533 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.888580 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.888610 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data-custom\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.888636 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.888684 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9vm5\" (UniqueName: \"kubernetes.io/projected/b3851f8b-c385-4852-8339-c9cb8f4586e5-kube-api-access-f9vm5\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.888709 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-config\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.888755 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prffl\" (UniqueName: \"kubernetes.io/projected/be37a16a-5f7e-4f97-b71d-ee344177919c-kube-api-access-prffl\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.894672 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.910213 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6896bbf7f5-dvgps" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.993589 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.993681 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.993716 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data-custom\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.993744 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.993776 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9vm5\" (UniqueName: \"kubernetes.io/projected/b3851f8b-c385-4852-8339-c9cb8f4586e5-kube-api-access-f9vm5\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " 
pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.993799 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-config\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.993843 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prffl\" (UniqueName: \"kubernetes.io/projected/be37a16a-5f7e-4f97-b71d-ee344177919c-kube-api-access-prffl\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.993899 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.993954 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-combined-ca-bundle\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.994816 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-svc\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 
crc kubenswrapper[4831]: I1204 10:34:39.994847 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be37a16a-5f7e-4f97-b71d-ee344177919c-logs\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.995130 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.995483 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be37a16a-5f7e-4f97-b71d-ee344177919c-logs\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.996206 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.996510 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-config\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.997214 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:39 crc kubenswrapper[4831]: I1204 10:34:39.999800 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data-custom\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:40 crc kubenswrapper[4831]: I1204 10:34:40.000288 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-combined-ca-bundle\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:40 crc kubenswrapper[4831]: I1204 10:34:40.001474 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:40 crc kubenswrapper[4831]: I1204 10:34:40.002506 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-svc\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:40 crc kubenswrapper[4831]: I1204 10:34:40.012714 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prffl\" (UniqueName: 
\"kubernetes.io/projected/be37a16a-5f7e-4f97-b71d-ee344177919c-kube-api-access-prffl\") pod \"barbican-api-545d99f8dd-55gfz\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:40 crc kubenswrapper[4831]: I1204 10:34:40.013845 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9vm5\" (UniqueName: \"kubernetes.io/projected/b3851f8b-c385-4852-8339-c9cb8f4586e5-kube-api-access-f9vm5\") pod \"dnsmasq-dns-6b9767fcff-cwqsr\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:40 crc kubenswrapper[4831]: I1204 10:34:40.224268 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:40 crc kubenswrapper[4831]: I1204 10:34:40.230734 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:40 crc kubenswrapper[4831]: I1204 10:34:40.961116 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 10:34:40 crc kubenswrapper[4831]: I1204 10:34:40.961169 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 04 10:34:40 crc kubenswrapper[4831]: I1204 10:34:40.961909 4831 scope.go:117] "RemoveContainer" containerID="7751384cd6e87417fdd57ce1a33bf4f4b451bc1ea43414d0e681fdb437c59e89" Dec 04 10:34:41 crc kubenswrapper[4831]: I1204 10:34:41.555405 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6694d6d998-wgcht" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Dec 04 10:34:41 crc kubenswrapper[4831]: I1204 10:34:41.737380 4831 generic.go:334] 
"Generic (PLEG): container finished" podID="e41bbb94-b986-4268-8040-77542216b905" containerID="abeef5ea612be1debf78fcd95ef3690d3ede7c824d0834ae5bda84e1dd93c780" exitCode=137 Dec 04 10:34:41 crc kubenswrapper[4831]: I1204 10:34:41.737421 4831 generic.go:334] "Generic (PLEG): container finished" podID="e41bbb94-b986-4268-8040-77542216b905" containerID="db4983df9038a909cc6c9e8bd5cddb1e6d9ff4f8542855af2578e2773b6584f9" exitCode=137 Dec 04 10:34:41 crc kubenswrapper[4831]: I1204 10:34:41.737461 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cb786f76c-8xlwx" event={"ID":"e41bbb94-b986-4268-8040-77542216b905","Type":"ContainerDied","Data":"abeef5ea612be1debf78fcd95ef3690d3ede7c824d0834ae5bda84e1dd93c780"} Dec 04 10:34:41 crc kubenswrapper[4831]: I1204 10:34:41.737491 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cb786f76c-8xlwx" event={"ID":"e41bbb94-b986-4268-8040-77542216b905","Type":"ContainerDied","Data":"db4983df9038a909cc6c9e8bd5cddb1e6d9ff4f8542855af2578e2773b6584f9"} Dec 04 10:34:41 crc kubenswrapper[4831]: I1204 10:34:41.740145 4831 generic.go:334] "Generic (PLEG): container finished" podID="85393e48-dbc6-41c7-9419-4baf00f072db" containerID="ced500055055bcb942e672ef9b1d0182950bde3c7bee1e55888858a7e706bad5" exitCode=137 Dec 04 10:34:41 crc kubenswrapper[4831]: I1204 10:34:41.740174 4831 generic.go:334] "Generic (PLEG): container finished" podID="85393e48-dbc6-41c7-9419-4baf00f072db" containerID="936c2a23b071f14add02e0c514fe8771dabd15308495a12484cd9404a1590f1d" exitCode=137 Dec 04 10:34:41 crc kubenswrapper[4831]: I1204 10:34:41.740194 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f77cc7d69-g6vcb" event={"ID":"85393e48-dbc6-41c7-9419-4baf00f072db","Type":"ContainerDied","Data":"ced500055055bcb942e672ef9b1d0182950bde3c7bee1e55888858a7e706bad5"} Dec 04 10:34:41 crc kubenswrapper[4831]: I1204 10:34:41.740215 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-6f77cc7d69-g6vcb" event={"ID":"85393e48-dbc6-41c7-9419-4baf00f072db","Type":"ContainerDied","Data":"936c2a23b071f14add02e0c514fe8771dabd15308495a12484cd9404a1590f1d"} Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.365934 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-78b878b7bb-lxbbq"] Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.367562 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.373501 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.373677 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.377361 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78b878b7bb-lxbbq"] Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.545102 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-internal-tls-certs\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.545157 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvm4p\" (UniqueName: \"kubernetes.io/projected/dead6584-8c6b-4231-b0c6-54d83d05c250-kube-api-access-zvm4p\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.545207 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-config-data-custom\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.545239 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-combined-ca-bundle\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.545370 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dead6584-8c6b-4231-b0c6-54d83d05c250-logs\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.545831 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-public-tls-certs\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.545983 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-config-data\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 
10:34:42.648335 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-combined-ca-bundle\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.648410 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dead6584-8c6b-4231-b0c6-54d83d05c250-logs\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.648481 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-public-tls-certs\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.648516 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-config-data\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.648591 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-internal-tls-certs\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.648617 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zvm4p\" (UniqueName: \"kubernetes.io/projected/dead6584-8c6b-4231-b0c6-54d83d05c250-kube-api-access-zvm4p\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.648719 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-config-data-custom\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.649995 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dead6584-8c6b-4231-b0c6-54d83d05c250-logs\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.655372 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-public-tls-certs\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.655883 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-internal-tls-certs\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.656379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-config-data-custom\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.666109 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-config-data\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.667097 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dead6584-8c6b-4231-b0c6-54d83d05c250-combined-ca-bundle\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.667640 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvm4p\" (UniqueName: \"kubernetes.io/projected/dead6584-8c6b-4231-b0c6-54d83d05c250-kube-api-access-zvm4p\") pod \"barbican-api-78b878b7bb-lxbbq\" (UID: \"dead6584-8c6b-4231-b0c6-54d83d05c250\") " pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:42 crc kubenswrapper[4831]: I1204 10:34:42.691685 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:46 crc kubenswrapper[4831]: I1204 10:34:46.341755 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.157:9322/\": dial tcp 10.217.0.157:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)" Dec 04 10:34:46 crc kubenswrapper[4831]: I1204 10:34:46.341757 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9322/\": dial tcp 10.217.0.157:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)" Dec 04 10:34:47 crc kubenswrapper[4831]: E1204 10:34:47.646716 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Dec 04 10:34:47 crc kubenswrapper[4831]: E1204 10:34:47.647002 4831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.47:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Dec 04 10:34:47 crc kubenswrapper[4831]: E1204 10:34:47.647177 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:38.102.83.47:5001/podified-master-centos10/openstack-glance-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rpv48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-g5bvk_openstack(18f48aa7-65d1-41ce-bc0d-4973db8b7abe): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Dec 04 10:34:47 crc kubenswrapper[4831]: E1204 10:34:47.649259 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-g5bvk" podUID="18f48aa7-65d1-41ce-bc0d-4973db8b7abe" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.759170 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.764491 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9hwx9" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.820907 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"077ce354-b1d3-40d2-bf19-0eee3d474753","Type":"ContainerDied","Data":"42c50d79fe4d00ec562856ffcd0393ff382e2a8a7f73e1b903f651da3536aea2"} Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.820980 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.826874 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9hwx9" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.827052 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9hwx9" event={"ID":"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e","Type":"ContainerDied","Data":"0e9857b0e3f42f15b48fc33dbf305ca1f2e5745bd0cfa5a505e1f0c2f262352b"} Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.827074 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e9857b0e3f42f15b48fc33dbf305ca1f2e5745bd0cfa5a505e1f0c2f262352b" Dec 04 10:34:47 crc kubenswrapper[4831]: E1204 10:34:47.827522 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.47:5001/podified-master-centos10/openstack-glance-api:watcher_latest\\\"\"" pod="openstack/glance-db-sync-g5bvk" podUID="18f48aa7-65d1-41ce-bc0d-4973db8b7abe" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.868618 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-config-data\") pod \"077ce354-b1d3-40d2-bf19-0eee3d474753\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.868946 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-logs\") pod \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.869034 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpj5d\" (UniqueName: \"kubernetes.io/projected/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-kube-api-access-vpj5d\") pod \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\" (UID: 
\"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.869129 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077ce354-b1d3-40d2-bf19-0eee3d474753-logs\") pod \"077ce354-b1d3-40d2-bf19-0eee3d474753\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.869190 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-combined-ca-bundle\") pod \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.869226 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbdnw\" (UniqueName: \"kubernetes.io/projected/077ce354-b1d3-40d2-bf19-0eee3d474753-kube-api-access-jbdnw\") pod \"077ce354-b1d3-40d2-bf19-0eee3d474753\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.869675 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-combined-ca-bundle\") pod \"077ce354-b1d3-40d2-bf19-0eee3d474753\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.869707 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-config-data\") pod \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.869747 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-scripts\") pod \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\" (UID: \"ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e\") " Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.869781 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-custom-prometheus-ca\") pod \"077ce354-b1d3-40d2-bf19-0eee3d474753\" (UID: \"077ce354-b1d3-40d2-bf19-0eee3d474753\") " Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.869507 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/077ce354-b1d3-40d2-bf19-0eee3d474753-logs" (OuterVolumeSpecName: "logs") pod "077ce354-b1d3-40d2-bf19-0eee3d474753" (UID: "077ce354-b1d3-40d2-bf19-0eee3d474753"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.869579 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-logs" (OuterVolumeSpecName: "logs") pod "ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" (UID: "ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.876406 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077ce354-b1d3-40d2-bf19-0eee3d474753-kube-api-access-jbdnw" (OuterVolumeSpecName: "kube-api-access-jbdnw") pod "077ce354-b1d3-40d2-bf19-0eee3d474753" (UID: "077ce354-b1d3-40d2-bf19-0eee3d474753"). InnerVolumeSpecName "kube-api-access-jbdnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.883003 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-kube-api-access-vpj5d" (OuterVolumeSpecName: "kube-api-access-vpj5d") pod "ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" (UID: "ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e"). InnerVolumeSpecName "kube-api-access-vpj5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.885827 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-scripts" (OuterVolumeSpecName: "scripts") pod "ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" (UID: "ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.911139 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" (UID: "ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.913807 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "077ce354-b1d3-40d2-bf19-0eee3d474753" (UID: "077ce354-b1d3-40d2-bf19-0eee3d474753"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.920374 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "077ce354-b1d3-40d2-bf19-0eee3d474753" (UID: "077ce354-b1d3-40d2-bf19-0eee3d474753"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.923898 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-config-data" (OuterVolumeSpecName: "config-data") pod "ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" (UID: "ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.962827 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-config-data" (OuterVolumeSpecName: "config-data") pod "077ce354-b1d3-40d2-bf19-0eee3d474753" (UID: "077ce354-b1d3-40d2-bf19-0eee3d474753"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.972930 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.973472 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.973481 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpj5d\" (UniqueName: \"kubernetes.io/projected/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-kube-api-access-vpj5d\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.973492 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077ce354-b1d3-40d2-bf19-0eee3d474753-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.973500 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.973509 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbdnw\" (UniqueName: \"kubernetes.io/projected/077ce354-b1d3-40d2-bf19-0eee3d474753-kube-api-access-jbdnw\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.973535 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.973544 4831 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.973552 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:47 crc kubenswrapper[4831]: I1204 10:34:47.973560 4831 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/077ce354-b1d3-40d2-bf19-0eee3d474753-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.206014 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.225618 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.237871 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 04 10:34:48 crc kubenswrapper[4831]: E1204 10:34:48.238367 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.238385 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api" Dec 04 10:34:48 crc kubenswrapper[4831]: E1204 10:34:48.238405 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" containerName="placement-db-sync" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.238413 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" containerName="placement-db-sync" Dec 04 10:34:48 crc kubenswrapper[4831]: E1204 
10:34:48.238440 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api-log" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.238449 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api-log" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.238707 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.238719 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" containerName="placement-db-sync" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.238738 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api-log" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.240123 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.242401 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.245731 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.246022 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.298823 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-config-data\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.298883 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.298915 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.298975 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-public-tls-certs\") pod \"watcher-api-0\" 
(UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.299068 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-logs\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.299136 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjvnl\" (UniqueName: \"kubernetes.io/projected/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-kube-api-access-sjvnl\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.299160 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.324429 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.400114 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-config-data\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.400156 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-custom-prometheus-ca\") pod \"watcher-api-0\" 
(UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.400177 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.400216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.400237 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-logs\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.400272 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvnl\" (UniqueName: \"kubernetes.io/projected/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-kube-api-access-sjvnl\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.400288 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.403284 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-logs\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.403694 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.405437 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-config-data\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.406123 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.406857 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.409973 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 
crc kubenswrapper[4831]: I1204 10:34:48.420047 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjvnl\" (UniqueName: \"kubernetes.io/projected/5eab9136-8caf-4dc6-81e4-3d8544e3ad94-kube-api-access-sjvnl\") pod \"watcher-api-0\" (UID: \"5eab9136-8caf-4dc6-81e4-3d8544e3ad94\") " pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.614097 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.974100 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-744bfc5f58-hs9c9"] Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.975563 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.980101 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.980205 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.980390 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.981775 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hmp6k" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.982022 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 10:34:48 crc kubenswrapper[4831]: I1204 10:34:48.990903 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-744bfc5f58-hs9c9"] Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.013638 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-internal-tls-certs\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.014039 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft5b9\" (UniqueName: \"kubernetes.io/projected/24ad7f65-67eb-4f94-9d64-2d14e393c978-kube-api-access-ft5b9\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.014277 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24ad7f65-67eb-4f94-9d64-2d14e393c978-logs\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.014428 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-config-data\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.014546 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-combined-ca-bundle\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.014683 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-scripts\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.014771 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-public-tls-certs\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.115797 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-scripts\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.115849 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-public-tls-certs\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.115897 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-internal-tls-certs\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.115947 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-ft5b9\" (UniqueName: \"kubernetes.io/projected/24ad7f65-67eb-4f94-9d64-2d14e393c978-kube-api-access-ft5b9\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.115966 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24ad7f65-67eb-4f94-9d64-2d14e393c978-logs\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.115985 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-config-data\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.116028 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-combined-ca-bundle\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.121819 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24ad7f65-67eb-4f94-9d64-2d14e393c978-logs\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.122304 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-scripts\") pod 
\"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.123508 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-combined-ca-bundle\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.123652 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-config-data\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.124954 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-internal-tls-certs\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.133236 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft5b9\" (UniqueName: \"kubernetes.io/projected/24ad7f65-67eb-4f94-9d64-2d14e393c978-kube-api-access-ft5b9\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.134045 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24ad7f65-67eb-4f94-9d64-2d14e393c978-public-tls-certs\") pod \"placement-744bfc5f58-hs9c9\" (UID: \"24ad7f65-67eb-4f94-9d64-2d14e393c978\") " 
pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.289793 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" path="/var/lib/kubelet/pods/077ce354-b1d3-40d2-bf19-0eee3d474753/volumes" Dec 04 10:34:49 crc kubenswrapper[4831]: I1204 10:34:49.296912 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:51 crc kubenswrapper[4831]: I1204 10:34:51.342705 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:34:51 crc kubenswrapper[4831]: I1204 10:34:51.342820 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="077ce354-b1d3-40d2-bf19-0eee3d474753" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.157:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:34:51 crc kubenswrapper[4831]: I1204 10:34:51.554643 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6694d6d998-wgcht" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Dec 04 10:34:51 crc kubenswrapper[4831]: I1204 10:34:51.904536 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6f799f5dc-hv6fj" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.430017 4831 scope.go:117] "RemoveContainer" containerID="4d14cc312ce3fdd5254c5eeef2f36ed49f38e1d7d481bf48b33eb6616558c934" Dec 04 10:34:52 crc kubenswrapper[4831]: 
I1204 10:34:52.481616 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.580772 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-config-data\") pod \"85393e48-dbc6-41c7-9419-4baf00f072db\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.580879 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-scripts\") pod \"85393e48-dbc6-41c7-9419-4baf00f072db\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.580942 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85393e48-dbc6-41c7-9419-4baf00f072db-horizon-secret-key\") pod \"85393e48-dbc6-41c7-9419-4baf00f072db\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.580982 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmzzf\" (UniqueName: \"kubernetes.io/projected/85393e48-dbc6-41c7-9419-4baf00f072db-kube-api-access-kmzzf\") pod \"85393e48-dbc6-41c7-9419-4baf00f072db\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.581025 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85393e48-dbc6-41c7-9419-4baf00f072db-logs\") pod \"85393e48-dbc6-41c7-9419-4baf00f072db\" (UID: \"85393e48-dbc6-41c7-9419-4baf00f072db\") " Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.583023 4831 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85393e48-dbc6-41c7-9419-4baf00f072db-logs" (OuterVolumeSpecName: "logs") pod "85393e48-dbc6-41c7-9419-4baf00f072db" (UID: "85393e48-dbc6-41c7-9419-4baf00f072db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.585646 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85393e48-dbc6-41c7-9419-4baf00f072db-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "85393e48-dbc6-41c7-9419-4baf00f072db" (UID: "85393e48-dbc6-41c7-9419-4baf00f072db"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.586361 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85393e48-dbc6-41c7-9419-4baf00f072db-kube-api-access-kmzzf" (OuterVolumeSpecName: "kube-api-access-kmzzf") pod "85393e48-dbc6-41c7-9419-4baf00f072db" (UID: "85393e48-dbc6-41c7-9419-4baf00f072db"). InnerVolumeSpecName "kube-api-access-kmzzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.605620 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-config-data" (OuterVolumeSpecName: "config-data") pod "85393e48-dbc6-41c7-9419-4baf00f072db" (UID: "85393e48-dbc6-41c7-9419-4baf00f072db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.607855 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-scripts" (OuterVolumeSpecName: "scripts") pod "85393e48-dbc6-41c7-9419-4baf00f072db" (UID: "85393e48-dbc6-41c7-9419-4baf00f072db"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.683111 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.683141 4831 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85393e48-dbc6-41c7-9419-4baf00f072db-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.683154 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmzzf\" (UniqueName: \"kubernetes.io/projected/85393e48-dbc6-41c7-9419-4baf00f072db-kube-api-access-kmzzf\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.683163 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85393e48-dbc6-41c7-9419-4baf00f072db-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.683171 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85393e48-dbc6-41c7-9419-4baf00f072db-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.891593 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f77cc7d69-g6vcb" Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.891639 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f77cc7d69-g6vcb" event={"ID":"85393e48-dbc6-41c7-9419-4baf00f072db","Type":"ContainerDied","Data":"e73ac3f72a023789976fd027e5019729d742db81e03acf7e43e140d1704cea71"} Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.895985 4831 generic.go:334] "Generic (PLEG): container finished" podID="41899055-8db6-4cdb-a9da-2bbb143b9f3f" containerID="3955d53461eddec4027820d5075fab1744fdfab5ddf7f0c3d04e497e833aa4b9" exitCode=0 Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.896037 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q6jc5" event={"ID":"41899055-8db6-4cdb-a9da-2bbb143b9f3f","Type":"ContainerDied","Data":"3955d53461eddec4027820d5075fab1744fdfab5ddf7f0c3d04e497e833aa4b9"} Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.931005 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f77cc7d69-g6vcb"] Dec 04 10:34:52 crc kubenswrapper[4831]: I1204 10:34:52.939165 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f77cc7d69-g6vcb"] Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.294563 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.301361 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85393e48-dbc6-41c7-9419-4baf00f072db" path="/var/lib/kubelet/pods/85393e48-dbc6-41c7-9419-4baf00f072db/volumes" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.399309 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-config-data\") pod \"e41bbb94-b986-4268-8040-77542216b905\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.399492 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e41bbb94-b986-4268-8040-77542216b905-horizon-secret-key\") pod \"e41bbb94-b986-4268-8040-77542216b905\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.399541 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cj74\" (UniqueName: \"kubernetes.io/projected/e41bbb94-b986-4268-8040-77542216b905-kube-api-access-6cj74\") pod \"e41bbb94-b986-4268-8040-77542216b905\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.399600 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-scripts\") pod \"e41bbb94-b986-4268-8040-77542216b905\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.399653 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41bbb94-b986-4268-8040-77542216b905-logs\") pod 
\"e41bbb94-b986-4268-8040-77542216b905\" (UID: \"e41bbb94-b986-4268-8040-77542216b905\") " Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.405173 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e41bbb94-b986-4268-8040-77542216b905-logs" (OuterVolumeSpecName: "logs") pod "e41bbb94-b986-4268-8040-77542216b905" (UID: "e41bbb94-b986-4268-8040-77542216b905"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.406618 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e41bbb94-b986-4268-8040-77542216b905-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e41bbb94-b986-4268-8040-77542216b905" (UID: "e41bbb94-b986-4268-8040-77542216b905"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.411950 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e41bbb94-b986-4268-8040-77542216b905-kube-api-access-6cj74" (OuterVolumeSpecName: "kube-api-access-6cj74") pod "e41bbb94-b986-4268-8040-77542216b905" (UID: "e41bbb94-b986-4268-8040-77542216b905"). InnerVolumeSpecName "kube-api-access-6cj74". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.429167 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-scripts" (OuterVolumeSpecName: "scripts") pod "e41bbb94-b986-4268-8040-77542216b905" (UID: "e41bbb94-b986-4268-8040-77542216b905"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.431323 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-config-data" (OuterVolumeSpecName: "config-data") pod "e41bbb94-b986-4268-8040-77542216b905" (UID: "e41bbb94-b986-4268-8040-77542216b905"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.501885 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.501920 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41bbb94-b986-4268-8040-77542216b905-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.501929 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e41bbb94-b986-4268-8040-77542216b905-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.501937 4831 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e41bbb94-b986-4268-8040-77542216b905-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.501949 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cj74\" (UniqueName: \"kubernetes.io/projected/e41bbb94-b986-4268-8040-77542216b905-kube-api-access-6cj74\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:53 crc kubenswrapper[4831]: E1204 10:34:53.819619 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest 
list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Dec 04 10:34:53 crc kubenswrapper[4831]: E1204 10:34:53.820133 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vq52w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(b769c363-a026-4bf4-9b56-3d1452b6847d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 10:34:53 crc kubenswrapper[4831]: E1204 10:34:53.821488 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.890362 4831 scope.go:117] "RemoveContainer" containerID="13382f8529ff796d8a356c27250577db1aff612b0eec44dd5ec22a8084db6f59" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.923001 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cb786f76c-8xlwx" Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.923707 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cb786f76c-8xlwx" event={"ID":"e41bbb94-b986-4268-8040-77542216b905","Type":"ContainerDied","Data":"07637652f60eb1d3bdcddc42ab98e28f52f5e2b9ed7d75a32fc515fa0eb5aec0"} Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.933346 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="ceilometer-central-agent" containerID="cri-o://021d8e6d61418e963a5e960d7a12ba82d80b9c251032566a717d19caa4e03441" gracePeriod=30 Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.933895 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="sg-core" containerID="cri-o://8f26f07839607b082f46d9f85dba97955a75c334c66d497c223ed587ce5b9dd4" gracePeriod=30 Dec 04 10:34:53 crc kubenswrapper[4831]: I1204 10:34:53.933999 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="ceilometer-notification-agent" containerID="cri-o://7c81ef06f0d2ef4ac48cdbdbbce8ccf5febd339c9121f55b37b94aff84afb322" gracePeriod=30 Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.050105 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cb786f76c-8xlwx"] Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.067208 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cb786f76c-8xlwx"] Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.177566 4831 scope.go:117] "RemoveContainer" containerID="ced500055055bcb942e672ef9b1d0182950bde3c7bee1e55888858a7e706bad5" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.353134 4831 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69bc569cc4-dg5gj"] Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.376869 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9767fcff-cwqsr"] Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.386994 4831 scope.go:117] "RemoveContainer" containerID="936c2a23b071f14add02e0c514fe8771dabd15308495a12484cd9404a1590f1d" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.392601 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78b878b7bb-lxbbq"] Dec 04 10:34:54 crc kubenswrapper[4831]: W1204 10:34:54.421808 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddead6584_8c6b_4231_b0c6_54d83d05c250.slice/crio-21b47eab97a5eebba75054d79a7d89add6595d8174256394cca5126fbfb94be0 WatchSource:0}: Error finding container 21b47eab97a5eebba75054d79a7d89add6595d8174256394cca5126fbfb94be0: Status 404 returned error can't find the container with id 21b47eab97a5eebba75054d79a7d89add6595d8174256394cca5126fbfb94be0 Dec 04 10:34:54 crc kubenswrapper[4831]: W1204 10:34:54.422143 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod734f14e8_f267_4c7c_a5eb_d76457ec9d69.slice/crio-b7a8617562701d991c7579b6e7a711ba949df14514e01251895f39638ef6bfc1 WatchSource:0}: Error finding container b7a8617562701d991c7579b6e7a711ba949df14514e01251895f39638ef6bfc1: Status 404 returned error can't find the container with id b7a8617562701d991c7579b6e7a711ba949df14514e01251895f39638ef6bfc1 Dec 04 10:34:54 crc kubenswrapper[4831]: W1204 10:34:54.425436 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3851f8b_c385_4852_8339_c9cb8f4586e5.slice/crio-6458d2bdb35ae592156499722b835506c6b8e8fe5cab916bc59019a7d05298e7 WatchSource:0}: Error finding container 6458d2bdb35ae592156499722b835506c6b8e8fe5cab916bc59019a7d05298e7: Status 404 returned error can't find the container with id 6458d2bdb35ae592156499722b835506c6b8e8fe5cab916bc59019a7d05298e7 Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.578405 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-545d99f8dd-55gfz"] Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.598962 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6896bbf7f5-dvgps"] Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.666635 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.727695 4831 scope.go:117] "RemoveContainer" containerID="abeef5ea612be1debf78fcd95ef3690d3ede7c824d0834ae5bda84e1dd93c780" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.755673 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-744bfc5f58-hs9c9"] Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.768099 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.832319 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-config\") pod \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.832384 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-combined-ca-bundle\") pod \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.832517 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpd6b\" (UniqueName: \"kubernetes.io/projected/41899055-8db6-4cdb-a9da-2bbb143b9f3f-kube-api-access-hpd6b\") pod \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\" (UID: \"41899055-8db6-4cdb-a9da-2bbb143b9f3f\") " Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.842614 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41899055-8db6-4cdb-a9da-2bbb143b9f3f-kube-api-access-hpd6b" (OuterVolumeSpecName: "kube-api-access-hpd6b") pod "41899055-8db6-4cdb-a9da-2bbb143b9f3f" (UID: "41899055-8db6-4cdb-a9da-2bbb143b9f3f"). InnerVolumeSpecName "kube-api-access-hpd6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.915359 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41899055-8db6-4cdb-a9da-2bbb143b9f3f" (UID: "41899055-8db6-4cdb-a9da-2bbb143b9f3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.920767 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-config" (OuterVolumeSpecName: "config") pod "41899055-8db6-4cdb-a9da-2bbb143b9f3f" (UID: "41899055-8db6-4cdb-a9da-2bbb143b9f3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.936630 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpd6b\" (UniqueName: \"kubernetes.io/projected/41899055-8db6-4cdb-a9da-2bbb143b9f3f-kube-api-access-hpd6b\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.936671 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.936683 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41899055-8db6-4cdb-a9da-2bbb143b9f3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.952034 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6896bbf7f5-dvgps" event={"ID":"8c3dabff-2635-4e29-9651-8df5d84838f9","Type":"ContainerStarted","Data":"443b77760164b6ef8a1767e2cb2028395e881489bab2d7de6f52795cbf951b01"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.955342 4831 generic.go:334] "Generic (PLEG): container finished" podID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerID="8f26f07839607b082f46d9f85dba97955a75c334c66d497c223ed587ce5b9dd4" exitCode=2 Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.955365 4831 generic.go:334] "Generic (PLEG): container finished" podID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerID="7c81ef06f0d2ef4ac48cdbdbbce8ccf5febd339c9121f55b37b94aff84afb322" exitCode=0 Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.955371 4831 generic.go:334] "Generic (PLEG): container finished" podID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerID="021d8e6d61418e963a5e960d7a12ba82d80b9c251032566a717d19caa4e03441" exitCode=0 Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 
10:34:54.955405 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b769c363-a026-4bf4-9b56-3d1452b6847d","Type":"ContainerDied","Data":"8f26f07839607b082f46d9f85dba97955a75c334c66d497c223ed587ce5b9dd4"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.955423 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b769c363-a026-4bf4-9b56-3d1452b6847d","Type":"ContainerDied","Data":"7c81ef06f0d2ef4ac48cdbdbbce8ccf5febd339c9121f55b37b94aff84afb322"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.955432 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b769c363-a026-4bf4-9b56-3d1452b6847d","Type":"ContainerDied","Data":"021d8e6d61418e963a5e960d7a12ba82d80b9c251032566a717d19caa4e03441"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.957070 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" event={"ID":"734f14e8-f267-4c7c-a5eb-d76457ec9d69","Type":"ContainerStarted","Data":"b7a8617562701d991c7579b6e7a711ba949df14514e01251895f39638ef6bfc1"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.963133 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6040f79c-a151-4681-a758-d2741bff68b6","Type":"ContainerStarted","Data":"bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.968924 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-545d99f8dd-55gfz" event={"ID":"be37a16a-5f7e-4f97-b71d-ee344177919c","Type":"ContainerStarted","Data":"71f5a152442f7f92c12464a98c106ae920c407a47b36f18c73a5483549259c64"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.968953 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-545d99f8dd-55gfz" 
event={"ID":"be37a16a-5f7e-4f97-b71d-ee344177919c","Type":"ContainerStarted","Data":"552f7a751bd7b367ae51aabd39eca1765da50a5dafba09c31a07128eb31c2095"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.971271 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78b878b7bb-lxbbq" event={"ID":"dead6584-8c6b-4231-b0c6-54d83d05c250","Type":"ContainerStarted","Data":"6f1bf867e4f0b9209a8aadb0e75a5629ce14eacfb6468b3cdc4087f0d2e03609"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.971292 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78b878b7bb-lxbbq" event={"ID":"dead6584-8c6b-4231-b0c6-54d83d05c250","Type":"ContainerStarted","Data":"21b47eab97a5eebba75054d79a7d89add6595d8174256394cca5126fbfb94be0"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.973385 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q6jc5" event={"ID":"41899055-8db6-4cdb-a9da-2bbb143b9f3f","Type":"ContainerDied","Data":"e1f772ec096859627654955f2c6c91d0fe6fa9818d8f0b540d61604d41528c2a"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.973405 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1f772ec096859627654955f2c6c91d0fe6fa9818d8f0b540d61604d41528c2a" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.973460 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-q6jc5" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.981283 4831 scope.go:117] "RemoveContainer" containerID="db4983df9038a909cc6c9e8bd5cddb1e6d9ff4f8542855af2578e2773b6584f9" Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.986275 4831 generic.go:334] "Generic (PLEG): container finished" podID="b3851f8b-c385-4852-8339-c9cb8f4586e5" containerID="4df4761576383142bd0ba848fb81f6cbc5e439c5f6787f7c356318ebb95a277f" exitCode=0 Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.986377 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" event={"ID":"b3851f8b-c385-4852-8339-c9cb8f4586e5","Type":"ContainerDied","Data":"4df4761576383142bd0ba848fb81f6cbc5e439c5f6787f7c356318ebb95a277f"} Dec 04 10:34:54 crc kubenswrapper[4831]: I1204 10:34:54.986445 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" event={"ID":"b3851f8b-c385-4852-8339-c9cb8f4586e5","Type":"ContainerStarted","Data":"6458d2bdb35ae592156499722b835506c6b8e8fe5cab916bc59019a7d05298e7"} Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.050523 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.139635 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-config-data\") pod \"b769c363-a026-4bf4-9b56-3d1452b6847d\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.139861 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-run-httpd\") pod \"b769c363-a026-4bf4-9b56-3d1452b6847d\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.139932 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-combined-ca-bundle\") pod \"b769c363-a026-4bf4-9b56-3d1452b6847d\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.140032 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-sg-core-conf-yaml\") pod \"b769c363-a026-4bf4-9b56-3d1452b6847d\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.144426 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b769c363-a026-4bf4-9b56-3d1452b6847d" (UID: "b769c363-a026-4bf4-9b56-3d1452b6847d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.182122 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9767fcff-cwqsr"] Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.223022 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7656fdcbd7-2jzrb"] Dec 04 10:34:55 crc kubenswrapper[4831]: E1204 10:34:55.224042 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85393e48-dbc6-41c7-9419-4baf00f072db" containerName="horizon" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224090 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="85393e48-dbc6-41c7-9419-4baf00f072db" containerName="horizon" Dec 04 10:34:55 crc kubenswrapper[4831]: E1204 10:34:55.224109 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41bbb94-b986-4268-8040-77542216b905" containerName="horizon-log" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224116 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41bbb94-b986-4268-8040-77542216b905" containerName="horizon-log" Dec 04 10:34:55 crc kubenswrapper[4831]: E1204 10:34:55.224137 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="sg-core" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224146 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="sg-core" Dec 04 10:34:55 crc kubenswrapper[4831]: E1204 10:34:55.224165 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41899055-8db6-4cdb-a9da-2bbb143b9f3f" containerName="neutron-db-sync" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224172 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="41899055-8db6-4cdb-a9da-2bbb143b9f3f" containerName="neutron-db-sync" Dec 04 10:34:55 crc kubenswrapper[4831]: E1204 10:34:55.224184 4831 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85393e48-dbc6-41c7-9419-4baf00f072db" containerName="horizon-log" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224192 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="85393e48-dbc6-41c7-9419-4baf00f072db" containerName="horizon-log" Dec 04 10:34:55 crc kubenswrapper[4831]: E1204 10:34:55.224209 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="ceilometer-central-agent" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224217 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="ceilometer-central-agent" Dec 04 10:34:55 crc kubenswrapper[4831]: E1204 10:34:55.224236 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41bbb94-b986-4268-8040-77542216b905" containerName="horizon" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224244 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41bbb94-b986-4268-8040-77542216b905" containerName="horizon" Dec 04 10:34:55 crc kubenswrapper[4831]: E1204 10:34:55.224256 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="ceilometer-notification-agent" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224264 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="ceilometer-notification-agent" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224494 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="41899055-8db6-4cdb-a9da-2bbb143b9f3f" containerName="neutron-db-sync" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224514 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41bbb94-b986-4268-8040-77542216b905" containerName="horizon" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 
10:34:55.224530 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="85393e48-dbc6-41c7-9419-4baf00f072db" containerName="horizon-log" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224542 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41bbb94-b986-4268-8040-77542216b905" containerName="horizon-log" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224552 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="ceilometer-central-agent" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224569 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="85393e48-dbc6-41c7-9419-4baf00f072db" containerName="horizon" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224577 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="sg-core" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.224590 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" containerName="ceilometer-notification-agent" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.228187 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.243086 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-scripts\") pod \"b769c363-a026-4bf4-9b56-3d1452b6847d\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.243674 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq52w\" (UniqueName: \"kubernetes.io/projected/b769c363-a026-4bf4-9b56-3d1452b6847d-kube-api-access-vq52w\") pod \"b769c363-a026-4bf4-9b56-3d1452b6847d\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.243820 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-log-httpd\") pod \"b769c363-a026-4bf4-9b56-3d1452b6847d\" (UID: \"b769c363-a026-4bf4-9b56-3d1452b6847d\") " Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.244641 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.247377 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b769c363-a026-4bf4-9b56-3d1452b6847d" (UID: "b769c363-a026-4bf4-9b56-3d1452b6847d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.253112 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.260679 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.256167 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b769c363-a026-4bf4-9b56-3d1452b6847d" (UID: "b769c363-a026-4bf4-9b56-3d1452b6847d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.267229 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b769c363-a026-4bf4-9b56-3d1452b6847d-kube-api-access-vq52w" (OuterVolumeSpecName: "kube-api-access-vq52w") pod "b769c363-a026-4bf4-9b56-3d1452b6847d" (UID: "b769c363-a026-4bf4-9b56-3d1452b6847d"). InnerVolumeSpecName "kube-api-access-vq52w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.267374 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b769c363-a026-4bf4-9b56-3d1452b6847d" (UID: "b769c363-a026-4bf4-9b56-3d1452b6847d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.270106 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-scripts" (OuterVolumeSpecName: "scripts") pod "b769c363-a026-4bf4-9b56-3d1452b6847d" (UID: "b769c363-a026-4bf4-9b56-3d1452b6847d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.295455 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-nkdbx" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.299121 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.299172 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357221 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d52tn\" (UniqueName: \"kubernetes.io/projected/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-kube-api-access-d52tn\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357256 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e13429a8-0182-4bfb-99b2-d1941a1e9af6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357305 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-sb\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357324 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqrfs\" (UniqueName: \"kubernetes.io/projected/e13429a8-0182-4bfb-99b2-d1941a1e9af6-kube-api-access-wqrfs\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357348 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-swift-storage-0\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357412 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e13429a8-0182-4bfb-99b2-d1941a1e9af6-openstack-config-secret\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357489 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-config\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357507 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/e13429a8-0182-4bfb-99b2-d1941a1e9af6-openstack-config\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357538 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-svc\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357574 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-nb\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357641 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357653 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357680 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357694 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq52w\" (UniqueName: 
\"kubernetes.io/projected/b769c363-a026-4bf4-9b56-3d1452b6847d-kube-api-access-vq52w\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.357706 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b769c363-a026-4bf4-9b56-3d1452b6847d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.371962 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e41bbb94-b986-4268-8040-77542216b905" path="/var/lib/kubelet/pods/e41bbb94-b986-4268-8040-77542216b905/volumes" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.396225 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7656fdcbd7-2jzrb"] Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.396261 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.460177 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-sb\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.460218 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqrfs\" (UniqueName: \"kubernetes.io/projected/e13429a8-0182-4bfb-99b2-d1941a1e9af6-kube-api-access-wqrfs\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.460265 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-swift-storage-0\") pod 
\"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.460297 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e13429a8-0182-4bfb-99b2-d1941a1e9af6-openstack-config-secret\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.460398 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-config\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.460423 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e13429a8-0182-4bfb-99b2-d1941a1e9af6-openstack-config\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.460454 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-svc\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.460504 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-nb\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " 
pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.460613 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d52tn\" (UniqueName: \"kubernetes.io/projected/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-kube-api-access-d52tn\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.460629 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e13429a8-0182-4bfb-99b2-d1941a1e9af6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.463253 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-sb\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.464328 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-swift-storage-0\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.464941 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-nb\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc 
kubenswrapper[4831]: I1204 10:34:55.465582 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-config\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.466298 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-svc\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.466312 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e13429a8-0182-4bfb-99b2-d1941a1e9af6-openstack-config\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.466604 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79f849bb84-btxkg"] Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.468152 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.474364 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.474482 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cf59p" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.474575 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.475183 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.481313 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e13429a8-0182-4bfb-99b2-d1941a1e9af6-openstack-config-secret\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.484350 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e13429a8-0182-4bfb-99b2-d1941a1e9af6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.493334 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d52tn\" (UniqueName: \"kubernetes.io/projected/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-kube-api-access-d52tn\") pod \"dnsmasq-dns-7656fdcbd7-2jzrb\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.496878 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-79f849bb84-btxkg"] Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.517074 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqrfs\" (UniqueName: \"kubernetes.io/projected/e13429a8-0182-4bfb-99b2-d1941a1e9af6-kube-api-access-wqrfs\") pod \"openstackclient\" (UID: \"e13429a8-0182-4bfb-99b2-d1941a1e9af6\") " pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.517831 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-config-data" (OuterVolumeSpecName: "config-data") pod "b769c363-a026-4bf4-9b56-3d1452b6847d" (UID: "b769c363-a026-4bf4-9b56-3d1452b6847d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.568112 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-combined-ca-bundle\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.568155 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-httpd-config\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.568181 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-ovndb-tls-certs\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " 
pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.568199 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-config\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.568243 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf6ft\" (UniqueName: \"kubernetes.io/projected/a02d0bff-55e1-4de5-95e4-98d65018cbf0-kube-api-access-lf6ft\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.568316 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b769c363-a026-4bf4-9b56-3d1452b6847d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.607768 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.669944 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-combined-ca-bundle\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.670321 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-httpd-config\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.670365 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-config\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.670389 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-ovndb-tls-certs\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.670470 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf6ft\" (UniqueName: \"kubernetes.io/projected/a02d0bff-55e1-4de5-95e4-98d65018cbf0-kube-api-access-lf6ft\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc 
kubenswrapper[4831]: I1204 10:34:55.676558 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-config\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.677421 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-combined-ca-bundle\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.684277 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-ovndb-tls-certs\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.690046 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-httpd-config\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.700562 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.701512 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf6ft\" (UniqueName: \"kubernetes.io/projected/a02d0bff-55e1-4de5-95e4-98d65018cbf0-kube-api-access-lf6ft\") pod \"neutron-79f849bb84-btxkg\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:55 crc kubenswrapper[4831]: I1204 10:34:55.860292 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.034501 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b769c363-a026-4bf4-9b56-3d1452b6847d","Type":"ContainerDied","Data":"f2c3e0a23f31bf8a111eff6abb6fe99f64eb8ffd0065a656e683fda674ca4796"} Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.034700 4831 scope.go:117] "RemoveContainer" containerID="8f26f07839607b082f46d9f85dba97955a75c334c66d497c223ed587ce5b9dd4" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.034802 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.053622 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-744bfc5f58-hs9c9" event={"ID":"24ad7f65-67eb-4f94-9d64-2d14e393c978","Type":"ContainerStarted","Data":"1a0f2209823c2270b57a790409fa78873d40f779054df74912aa5d87ad1f6e2e"} Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.062122 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5eab9136-8caf-4dc6-81e4-3d8544e3ad94","Type":"ContainerStarted","Data":"5c35c68e8da7ed36c9faf5b443349c9aab5f576d705e8e8a5d568e91114065d4"} Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.062169 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5eab9136-8caf-4dc6-81e4-3d8544e3ad94","Type":"ContainerStarted","Data":"71efc360ebe4f9cf0b19cec96b9914504d31dc2c05582555fe460345403531c4"} Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.112233 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.123324 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.134424 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.136510 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.143086 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.143268 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.147576 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.307187 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-scripts\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.307268 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.307391 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.307437 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " 
pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.307468 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2hfl\" (UniqueName: \"kubernetes.io/projected/f14967a3-96b4-46ce-8685-0b644a080cc8-kube-api-access-b2hfl\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.307503 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-config-data\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.307607 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.416798 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.416968 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-scripts\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.417002 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.417231 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.417292 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.417324 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2hfl\" (UniqueName: \"kubernetes.io/projected/f14967a3-96b4-46ce-8685-0b644a080cc8-kube-api-access-b2hfl\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.417343 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-config-data\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.419997 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 
10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.420218 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.429368 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-scripts\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.430083 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-config-data\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.441089 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.441637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.445386 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2hfl\" (UniqueName: \"kubernetes.io/projected/f14967a3-96b4-46ce-8685-0b644a080cc8-kube-api-access-b2hfl\") pod \"ceilometer-0\" (UID: 
\"f14967a3-96b4-46ce-8685-0b644a080cc8\") " pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: I1204 10:34:56.469095 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:34:56 crc kubenswrapper[4831]: E1204 10:34:56.990123 4831 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 04 10:34:56 crc kubenswrapper[4831]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b3851f8b-c385-4852-8339-c9cb8f4586e5/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 04 10:34:56 crc kubenswrapper[4831]: > podSandboxID="6458d2bdb35ae592156499722b835506c6b8e8fe5cab916bc59019a7d05298e7" Dec 04 10:34:56 crc kubenswrapper[4831]: E1204 10:34:56.990367 4831 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 04 10:34:56 crc kubenswrapper[4831]: container &Container{Name:dnsmasq-dns,Image:38.102.83.47:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66fh577h59bh56chd8h5cbh56bh595h5bdhf5h5b9h77h66chc9h9bh5bfh588h654h589h5c6hbdh68ch75hc6h57bh8fh698h86hb7h7h54h549q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f9vm5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6b9767fcff-cwqsr_openstack(b3851f8b-c385-4852-8339-c9cb8f4586e5): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b3851f8b-c385-4852-8339-c9cb8f4586e5/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 04 10:34:56 crc kubenswrapper[4831]: > logger="UnhandledError" Dec 04 10:34:56 crc kubenswrapper[4831]: E1204 10:34:56.991699 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b3851f8b-c385-4852-8339-c9cb8f4586e5/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" podUID="b3851f8b-c385-4852-8339-c9cb8f4586e5" Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.062920 4831 scope.go:117] "RemoveContainer" containerID="7c81ef06f0d2ef4ac48cdbdbbce8ccf5febd339c9121f55b37b94aff84afb322" Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.122882 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-78b878b7bb-lxbbq" event={"ID":"dead6584-8c6b-4231-b0c6-54d83d05c250","Type":"ContainerStarted","Data":"b6e51fa9d4897f475b021dffe5e4a28b265b4b2bb9bcd16fb86b3b6b11988b45"} Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.123361 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.123632 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.154261 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-78b878b7bb-lxbbq" podStartSLOduration=15.154242965 podStartE2EDuration="15.154242965s" podCreationTimestamp="2025-12-04 10:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:57.141807199 +0000 UTC m=+1194.090982513" watchObservedRunningTime="2025-12-04 10:34:57.154242965 +0000 UTC m=+1194.103418279" Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.170850 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-744bfc5f58-hs9c9" event={"ID":"24ad7f65-67eb-4f94-9d64-2d14e393c978","Type":"ContainerStarted","Data":"9acafc6125c874a0bbb7ad6cf358ea2143408598fdd38a4b95e287d7e2937257"} Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.181318 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dmcxs" event={"ID":"94129f00-4043-4552-9724-feef1585cd20","Type":"ContainerStarted","Data":"0932a40e56e000a340375dbe475e2b4e990f48ccb1c48929447bb5a112edb0b3"} Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.186321 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-545d99f8dd-55gfz" 
event={"ID":"be37a16a-5f7e-4f97-b71d-ee344177919c","Type":"ContainerStarted","Data":"128236ce2adfca9eacb943496711e32ee6b4f03660b05db1105554aed5dbb6d7"} Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.220600 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-dmcxs" podStartSLOduration=6.442556574 podStartE2EDuration="42.220572419s" podCreationTimestamp="2025-12-04 10:34:15 +0000 UTC" firstStartedPulling="2025-12-04 10:34:18.403611606 +0000 UTC m=+1155.352786920" lastFinishedPulling="2025-12-04 10:34:54.181627461 +0000 UTC m=+1191.130802765" observedRunningTime="2025-12-04 10:34:57.21193096 +0000 UTC m=+1194.161106274" watchObservedRunningTime="2025-12-04 10:34:57.220572419 +0000 UTC m=+1194.169747733" Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.240066 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-545d99f8dd-55gfz" podStartSLOduration=18.240052324 podStartE2EDuration="18.240052324s" podCreationTimestamp="2025-12-04 10:34:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:57.239316375 +0000 UTC m=+1194.188491689" watchObservedRunningTime="2025-12-04 10:34:57.240052324 +0000 UTC m=+1194.189227638" Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.290971 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b769c363-a026-4bf4-9b56-3d1452b6847d" path="/var/lib/kubelet/pods/b769c363-a026-4bf4-9b56-3d1452b6847d/volumes" Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.583530 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7656fdcbd7-2jzrb"] Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.972228 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66c4c75c85-69mpg"] Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.974273 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.976943 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.977179 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 04 10:34:57 crc kubenswrapper[4831]: I1204 10:34:57.990766 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66c4c75c85-69mpg"] Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.066154 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-httpd-config\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.066210 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-combined-ca-bundle\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.066273 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-config\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.066360 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-internal-tls-certs\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.066417 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-public-tls-certs\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.066462 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbtxv\" (UniqueName: \"kubernetes.io/projected/2f2179f9-7122-438d-85cc-012b724ccae8-kube-api-access-tbtxv\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.066494 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-ovndb-tls-certs\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.148207 4831 scope.go:117] "RemoveContainer" containerID="021d8e6d61418e963a5e960d7a12ba82d80b9c251032566a717d19caa4e03441" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.171033 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-internal-tls-certs\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 
10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.171837 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-public-tls-certs\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.172096 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbtxv\" (UniqueName: \"kubernetes.io/projected/2f2179f9-7122-438d-85cc-012b724ccae8-kube-api-access-tbtxv\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.172282 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-ovndb-tls-certs\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.172508 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-httpd-config\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.172680 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-combined-ca-bundle\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.172939 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-config\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.184902 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-config\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.186685 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-public-tls-certs\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.188467 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-httpd-config\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.189461 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-ovndb-tls-certs\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.189729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-internal-tls-certs\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.190988 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbtxv\" (UniqueName: \"kubernetes.io/projected/2f2179f9-7122-438d-85cc-012b724ccae8-kube-api-access-tbtxv\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.192072 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2179f9-7122-438d-85cc-012b724ccae8-combined-ca-bundle\") pod \"neutron-66c4c75c85-69mpg\" (UID: \"2f2179f9-7122-438d-85cc-012b724ccae8\") " pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.226622 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5eab9136-8caf-4dc6-81e4-3d8544e3ad94","Type":"ContainerStarted","Data":"a2d49533fd2339a4db175303b8fa1d1e5c97d4970d7acb838348e518cc9cd027"} Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.228180 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.269845 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=10.269829499 podStartE2EDuration="10.269829499s" podCreationTimestamp="2025-12-04 10:34:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:58.252424737 +0000 UTC m=+1195.201600071" watchObservedRunningTime="2025-12-04 10:34:58.269829499 +0000 UTC m=+1195.219004813" Dec 04 
10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.315221 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" event={"ID":"b3851f8b-c385-4852-8339-c9cb8f4586e5","Type":"ContainerDied","Data":"6458d2bdb35ae592156499722b835506c6b8e8fe5cab916bc59019a7d05298e7"} Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.315264 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6458d2bdb35ae592156499722b835506c6b8e8fe5cab916bc59019a7d05298e7" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.324478 4831 generic.go:334] "Generic (PLEG): container finished" podID="6040f79c-a151-4681-a758-d2741bff68b6" containerID="bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3" exitCode=1 Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.324551 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6040f79c-a151-4681-a758-d2741bff68b6","Type":"ContainerDied","Data":"bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3"} Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.325168 4831 scope.go:117] "RemoveContainer" containerID="bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3" Dec 04 10:34:58 crc kubenswrapper[4831]: E1204 10:34:58.325469 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6040f79c-a151-4681-a758-d2741bff68b6)\"" pod="openstack/watcher-decision-engine-0" podUID="6040f79c-a151-4681-a758-d2741bff68b6" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.335138 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.345552 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" event={"ID":"a7a16dd7-b208-4a3a-9309-0ba292ed12fe","Type":"ContainerStarted","Data":"44c9d2d596f73dd964f999f2ed0d4ad78628af94bde3de97b9fcf9b5864fbacf"} Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.345592 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.346031 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.358223 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.389209 4831 scope.go:117] "RemoveContainer" containerID="7751384cd6e87417fdd57ce1a33bf4f4b451bc1ea43414d0e681fdb437c59e89" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.390139 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-nb\") pod \"b3851f8b-c385-4852-8339-c9cb8f4586e5\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.390187 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-swift-storage-0\") pod \"b3851f8b-c385-4852-8339-c9cb8f4586e5\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.390263 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-sb\") pod \"b3851f8b-c385-4852-8339-c9cb8f4586e5\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.390311 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9vm5\" (UniqueName: \"kubernetes.io/projected/b3851f8b-c385-4852-8339-c9cb8f4586e5-kube-api-access-f9vm5\") pod \"b3851f8b-c385-4852-8339-c9cb8f4586e5\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.390385 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-config\") pod \"b3851f8b-c385-4852-8339-c9cb8f4586e5\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.390424 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-svc\") pod \"b3851f8b-c385-4852-8339-c9cb8f4586e5\" (UID: \"b3851f8b-c385-4852-8339-c9cb8f4586e5\") " Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.400270 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3851f8b-c385-4852-8339-c9cb8f4586e5-kube-api-access-f9vm5" (OuterVolumeSpecName: "kube-api-access-f9vm5") pod "b3851f8b-c385-4852-8339-c9cb8f4586e5" (UID: "b3851f8b-c385-4852-8339-c9cb8f4586e5"). InnerVolumeSpecName "kube-api-access-f9vm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.494836 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9vm5\" (UniqueName: \"kubernetes.io/projected/b3851f8b-c385-4852-8339-c9cb8f4586e5-kube-api-access-f9vm5\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.615687 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.615914 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.662008 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-config" (OuterVolumeSpecName: "config") pod "b3851f8b-c385-4852-8339-c9cb8f4586e5" (UID: "b3851f8b-c385-4852-8339-c9cb8f4586e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.674190 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.707385 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.723092 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3851f8b-c385-4852-8339-c9cb8f4586e5" (UID: "b3851f8b-c385-4852-8339-c9cb8f4586e5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.761253 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3851f8b-c385-4852-8339-c9cb8f4586e5" (UID: "b3851f8b-c385-4852-8339-c9cb8f4586e5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.761439 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b3851f8b-c385-4852-8339-c9cb8f4586e5" (UID: "b3851f8b-c385-4852-8339-c9cb8f4586e5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.781155 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3851f8b-c385-4852-8339-c9cb8f4586e5" (UID: "b3851f8b-c385-4852-8339-c9cb8f4586e5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.809640 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.809701 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.809714 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.809725 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3851f8b-c385-4852-8339-c9cb8f4586e5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.867160 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79f849bb84-btxkg"] Dec 04 10:34:58 crc kubenswrapper[4831]: I1204 10:34:58.943476 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.298437 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66c4c75c85-69mpg"] Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.360069 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-744bfc5f58-hs9c9" event={"ID":"24ad7f65-67eb-4f94-9d64-2d14e393c978","Type":"ContainerStarted","Data":"9f7bd02f2c9df5f04701405fa4befa3130801b1b9870025151d91d6ca427f88a"} Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.361338 4831 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.361375 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.367293 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7a16dd7-b208-4a3a-9309-0ba292ed12fe" containerID="294914b80011213e69f277fa86f5427efde413629150503558c967245506f485" exitCode=0 Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.367376 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" event={"ID":"a7a16dd7-b208-4a3a-9309-0ba292ed12fe","Type":"ContainerDied","Data":"294914b80011213e69f277fa86f5427efde413629150503558c967245506f485"} Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.373109 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66c4c75c85-69mpg" event={"ID":"2f2179f9-7122-438d-85cc-012b724ccae8","Type":"ContainerStarted","Data":"9510b035c7e545ec9e5f00f77be9754338878a0ee09d3cbdf71f9717e1d7569b"} Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.378477 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f849bb84-btxkg" event={"ID":"a02d0bff-55e1-4de5-95e4-98d65018cbf0","Type":"ContainerStarted","Data":"c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0"} Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.378522 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f849bb84-btxkg" event={"ID":"a02d0bff-55e1-4de5-95e4-98d65018cbf0","Type":"ContainerStarted","Data":"95648dbaee9e864ba2ccf55317ebcdb751ed7d8234452c93f34ad216526502b4"} Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.394825 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-744bfc5f58-hs9c9" podStartSLOduration=11.394802083 
podStartE2EDuration="11.394802083s" podCreationTimestamp="2025-12-04 10:34:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:59.38326398 +0000 UTC m=+1196.332439314" watchObservedRunningTime="2025-12-04 10:34:59.394802083 +0000 UTC m=+1196.343977417" Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.436632 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" event={"ID":"734f14e8-f267-4c7c-a5eb-d76457ec9d69","Type":"ContainerStarted","Data":"21fabf9e4d841a5963a9d80c4280d643470a93f7e3d6f88a33d974cb80520242"} Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.436700 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" event={"ID":"734f14e8-f267-4c7c-a5eb-d76457ec9d69","Type":"ContainerStarted","Data":"3eae23560a016bf9d07b9e0eb3ef5be4e4d4f226ce920b4607c94e76f49b98c7"} Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.455056 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6896bbf7f5-dvgps" event={"ID":"8c3dabff-2635-4e29-9651-8df5d84838f9","Type":"ContainerStarted","Data":"5c9837382db81ed3262d63ae0b1bbc74b091489f255df1c6313a71a90f1525f7"} Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.460559 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-69bc569cc4-dg5gj" podStartSLOduration=16.781988329 podStartE2EDuration="20.460533612s" podCreationTimestamp="2025-12-04 10:34:39 +0000 UTC" firstStartedPulling="2025-12-04 10:34:54.425596118 +0000 UTC m=+1191.374771442" lastFinishedPulling="2025-12-04 10:34:58.104141411 +0000 UTC m=+1195.053316725" observedRunningTime="2025-12-04 10:34:59.456319765 +0000 UTC m=+1196.405495079" watchObservedRunningTime="2025-12-04 10:34:59.460533612 +0000 UTC m=+1196.409708926" Dec 04 10:34:59 crc 
kubenswrapper[4831]: I1204 10:34:59.464588 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"e13429a8-0182-4bfb-99b2-d1941a1e9af6","Type":"ContainerStarted","Data":"5f3f2a17a7dea4a5c1f13428aaf2bbe86e1e96d82e9933d826ec77dc0e7e974d"} Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.483385 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9767fcff-cwqsr" Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.484283 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f14967a3-96b4-46ce-8685-0b644a080cc8","Type":"ContainerStarted","Data":"c4e5baced66da659af4ca27729007be4ba62d7cc44df7500186c46d2ff071c45"} Dec 04 10:34:59 crc kubenswrapper[4831]: I1204 10:34:59.616693 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="5eab9136-8caf-4dc6-81e4-3d8544e3ad94" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.0.168:9322/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.016274 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9767fcff-cwqsr"] Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.027746 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9767fcff-cwqsr"] Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.522022 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-g5bvk" event={"ID":"18f48aa7-65d1-41ce-bc0d-4973db8b7abe","Type":"ContainerStarted","Data":"374ed1560d678f11ec4bed60c79dbd54071ebb83505a5e220a5dee9f726db39f"} Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.549052 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" 
event={"ID":"a7a16dd7-b208-4a3a-9309-0ba292ed12fe","Type":"ContainerStarted","Data":"36db02e40224a63052c46c613debcf4d8c24d650642a0d056f187b6071ea0748"} Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.549424 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.561838 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66c4c75c85-69mpg" event={"ID":"2f2179f9-7122-438d-85cc-012b724ccae8","Type":"ContainerStarted","Data":"6dbedc2d19ef19a798560d6d9d53fba32efb8f77418b1172f7b1de952dca0c68"} Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.577566 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-g5bvk" podStartSLOduration=5.779871997 podStartE2EDuration="37.577542934s" podCreationTimestamp="2025-12-04 10:34:23 +0000 UTC" firstStartedPulling="2025-12-04 10:34:26.827323303 +0000 UTC m=+1163.776498617" lastFinishedPulling="2025-12-04 10:34:58.62499424 +0000 UTC m=+1195.574169554" observedRunningTime="2025-12-04 10:35:00.547951902 +0000 UTC m=+1197.497127216" watchObservedRunningTime="2025-12-04 10:35:00.577542934 +0000 UTC m=+1197.526718248" Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.596765 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" podStartSLOduration=5.596745072 podStartE2EDuration="5.596745072s" podCreationTimestamp="2025-12-04 10:34:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:00.568230307 +0000 UTC m=+1197.517405611" watchObservedRunningTime="2025-12-04 10:35:00.596745072 +0000 UTC m=+1197.545920386" Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.582870 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6896bbf7f5-dvgps" 
event={"ID":"8c3dabff-2635-4e29-9651-8df5d84838f9","Type":"ContainerStarted","Data":"ea9a0958214cea0d3497c60985a909056d6f1d0bda0792c6a969a26b366c998f"} Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.600888 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f849bb84-btxkg" event={"ID":"a02d0bff-55e1-4de5-95e4-98d65018cbf0","Type":"ContainerStarted","Data":"a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3"} Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.601912 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.639414 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f14967a3-96b4-46ce-8685-0b644a080cc8","Type":"ContainerStarted","Data":"f97e802035facbe195f98e253e44014237f087da841b85a9abe981efc0004f8e"} Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.639457 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f14967a3-96b4-46ce-8685-0b644a080cc8","Type":"ContainerStarted","Data":"616e82a49829757f1f4dac4e5f428b0b3b339bf0e27da23b7cf9003f49b33e0c"} Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.639954 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.649004 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6896bbf7f5-dvgps" podStartSLOduration=18.024586332 podStartE2EDuration="21.648974348s" podCreationTimestamp="2025-12-04 10:34:39 +0000 UTC" firstStartedPulling="2025-12-04 10:34:54.586480255 +0000 UTC m=+1191.535655569" lastFinishedPulling="2025-12-04 10:34:58.210868271 +0000 UTC m=+1195.160043585" observedRunningTime="2025-12-04 10:35:00.616147864 +0000 UTC m=+1197.565323188" watchObservedRunningTime="2025-12-04 10:35:00.648974348 +0000 UTC 
m=+1197.598149662" Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.653531 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79f849bb84-btxkg" podStartSLOduration=5.653516314 podStartE2EDuration="5.653516314s" podCreationTimestamp="2025-12-04 10:34:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:00.644635498 +0000 UTC m=+1197.593810812" watchObservedRunningTime="2025-12-04 10:35:00.653516314 +0000 UTC m=+1197.602691628" Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.961746 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.961799 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 10:35:00 crc kubenswrapper[4831]: I1204 10:35:00.962405 4831 scope.go:117] "RemoveContainer" containerID="bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3" Dec 04 10:35:00 crc kubenswrapper[4831]: E1204 10:35:00.962715 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6040f79c-a151-4681-a758-d2741bff68b6)\"" pod="openstack/watcher-decision-engine-0" podUID="6040f79c-a151-4681-a758-d2741bff68b6" Dec 04 10:35:01 crc kubenswrapper[4831]: I1204 10:35:01.302637 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3851f8b-c385-4852-8339-c9cb8f4586e5" path="/var/lib/kubelet/pods/b3851f8b-c385-4852-8339-c9cb8f4586e5/volumes" Dec 04 10:35:01 crc kubenswrapper[4831]: I1204 10:35:01.555306 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6694d6d998-wgcht" 
podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Dec 04 10:35:01 crc kubenswrapper[4831]: I1204 10:35:01.555409 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:35:01 crc kubenswrapper[4831]: I1204 10:35:01.662736 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66c4c75c85-69mpg" event={"ID":"2f2179f9-7122-438d-85cc-012b724ccae8","Type":"ContainerStarted","Data":"aab9270ea0ba602947b70622a8e5a47deea2c19534217d419f880307ae13179b"} Dec 04 10:35:01 crc kubenswrapper[4831]: I1204 10:35:01.664798 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:35:01 crc kubenswrapper[4831]: I1204 10:35:01.703110 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66c4c75c85-69mpg" podStartSLOduration=4.703087391 podStartE2EDuration="4.703087391s" podCreationTimestamp="2025-12-04 10:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:01.690359938 +0000 UTC m=+1198.639535272" watchObservedRunningTime="2025-12-04 10:35:01.703087391 +0000 UTC m=+1198.652262695" Dec 04 10:35:02 crc kubenswrapper[4831]: I1204 10:35:02.774296 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 04 10:35:02 crc kubenswrapper[4831]: I1204 10:35:02.919319 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:35:02 crc kubenswrapper[4831]: I1204 10:35:02.919744 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 10:35:02 crc kubenswrapper[4831]: I1204 
10:35:02.942413 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.288244 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.356985 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-8c9449df7-jllzg"] Dec 04 10:35:04 crc kubenswrapper[4831]: E1204 10:35:04.357501 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3851f8b-c385-4852-8339-c9cb8f4586e5" containerName="init" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.357513 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3851f8b-c385-4852-8339-c9cb8f4586e5" containerName="init" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.357707 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3851f8b-c385-4852-8339-c9cb8f4586e5" containerName="init" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.359223 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.364001 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.364346 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.364498 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.373031 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-8c9449df7-jllzg"] Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.482850 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-internal-tls-certs\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.482957 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-run-httpd\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.482986 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-log-httpd\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.483016 
4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dxm7\" (UniqueName: \"kubernetes.io/projected/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-kube-api-access-6dxm7\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.483067 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-public-tls-certs\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.483155 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-config-data\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.483292 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-combined-ca-bundle\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.483341 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-etc-swift\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc 
kubenswrapper[4831]: I1204 10:35:04.483815 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78b878b7bb-lxbbq" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.563262 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-545d99f8dd-55gfz"] Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.587839 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-internal-tls-certs\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.587909 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-run-httpd\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.587931 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-log-httpd\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.587949 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dxm7\" (UniqueName: \"kubernetes.io/projected/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-kube-api-access-6dxm7\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.587978 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-public-tls-certs\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.588033 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-config-data\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.588088 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-combined-ca-bundle\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.588118 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-etc-swift\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.588867 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-run-httpd\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.589350 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-log-httpd\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.597727 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-combined-ca-bundle\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.601949 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-config-data\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.605634 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-internal-tls-certs\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.612763 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-etc-swift\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.613432 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-public-tls-certs\") pod \"swift-proxy-8c9449df7-jllzg\" 
(UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.639333 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dxm7\" (UniqueName: \"kubernetes.io/projected/4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0-kube-api-access-6dxm7\") pod \"swift-proxy-8c9449df7-jllzg\" (UID: \"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0\") " pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.707925 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api-log" containerID="cri-o://71f5a152442f7f92c12464a98c106ae920c407a47b36f18c73a5483549259c64" gracePeriod=30 Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.708434 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api" containerID="cri-o://128236ce2adfca9eacb943496711e32ee6b4f03660b05db1105554aed5dbb6d7" gracePeriod=30 Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.714535 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.714595 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.714805 4831 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.714926 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Dec 04 10:35:04 crc kubenswrapper[4831]: I1204 10:35:04.726440 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:05 crc kubenswrapper[4831]: I1204 10:35:05.611930 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:35:05 crc kubenswrapper[4831]: I1204 10:35:05.684025 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ddf665d6c-8wdxb"] Dec 04 10:35:05 crc kubenswrapper[4831]: I1204 10:35:05.684293 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" podUID="e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" containerName="dnsmasq-dns" containerID="cri-o://93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064" gracePeriod=10 Dec 04 10:35:05 crc kubenswrapper[4831]: I1204 10:35:05.753465 4831 generic.go:334] "Generic (PLEG): container finished" podID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerID="71f5a152442f7f92c12464a98c106ae920c407a47b36f18c73a5483549259c64" exitCode=143 Dec 04 10:35:05 crc kubenswrapper[4831]: I1204 10:35:05.753523 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-545d99f8dd-55gfz" event={"ID":"be37a16a-5f7e-4f97-b71d-ee344177919c","Type":"ContainerDied","Data":"71f5a152442f7f92c12464a98c106ae920c407a47b36f18c73a5483549259c64"} Dec 04 10:35:05 
crc kubenswrapper[4831]: I1204 10:35:05.759636 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-8c9449df7-jllzg"] Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.126403 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.306456 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.434308 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-svc\") pod \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.434466 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-sb\") pod \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.434516 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-config\") pod \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.434542 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmv96\" (UniqueName: \"kubernetes.io/projected/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-kube-api-access-pmv96\") pod \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.434572 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-nb\") pod \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.434631 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-swift-storage-0\") pod \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\" (UID: \"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.446919 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-kube-api-access-pmv96" (OuterVolumeSpecName: "kube-api-access-pmv96") pod "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" (UID: "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad"). InnerVolumeSpecName "kube-api-access-pmv96". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.476275 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.525292 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" (UID: "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.534159 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-config" (OuterVolumeSpecName: "config") pod "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" (UID: "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.536915 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmv96\" (UniqueName: \"kubernetes.io/projected/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-kube-api-access-pmv96\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.536943 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.536954 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.558261 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" (UID: "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.558913 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" (UID: "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.565374 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" (UID: "e40b774b-77a1-455f-90c3-a4fe7ff1c9ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.637397 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-config-data\") pod \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.637441 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwhgl\" (UniqueName: \"kubernetes.io/projected/42db10e0-67a3-49d3-b6c5-8f48e31775e7-kube-api-access-gwhgl\") pod \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.637494 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42db10e0-67a3-49d3-b6c5-8f48e31775e7-logs\") pod \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " Dec 04 10:35:06 crc 
kubenswrapper[4831]: I1204 10:35:06.637573 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-secret-key\") pod \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.637612 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-scripts\") pod \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.637644 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-tls-certs\") pod \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.637701 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-combined-ca-bundle\") pod \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\" (UID: \"42db10e0-67a3-49d3-b6c5-8f48e31775e7\") " Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.638090 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.638109 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.638118 4831 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.638680 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42db10e0-67a3-49d3-b6c5-8f48e31775e7-logs" (OuterVolumeSpecName: "logs") pod "42db10e0-67a3-49d3-b6c5-8f48e31775e7" (UID: "42db10e0-67a3-49d3-b6c5-8f48e31775e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.645268 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42db10e0-67a3-49d3-b6c5-8f48e31775e7-kube-api-access-gwhgl" (OuterVolumeSpecName: "kube-api-access-gwhgl") pod "42db10e0-67a3-49d3-b6c5-8f48e31775e7" (UID: "42db10e0-67a3-49d3-b6c5-8f48e31775e7"). InnerVolumeSpecName "kube-api-access-gwhgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.650028 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "42db10e0-67a3-49d3-b6c5-8f48e31775e7" (UID: "42db10e0-67a3-49d3-b6c5-8f48e31775e7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.685456 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-config-data" (OuterVolumeSpecName: "config-data") pod "42db10e0-67a3-49d3-b6c5-8f48e31775e7" (UID: "42db10e0-67a3-49d3-b6c5-8f48e31775e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.718245 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-scripts" (OuterVolumeSpecName: "scripts") pod "42db10e0-67a3-49d3-b6c5-8f48e31775e7" (UID: "42db10e0-67a3-49d3-b6c5-8f48e31775e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.731797 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42db10e0-67a3-49d3-b6c5-8f48e31775e7" (UID: "42db10e0-67a3-49d3-b6c5-8f48e31775e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.743158 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.743193 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwhgl\" (UniqueName: \"kubernetes.io/projected/42db10e0-67a3-49d3-b6c5-8f48e31775e7-kube-api-access-gwhgl\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.743205 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42db10e0-67a3-49d3-b6c5-8f48e31775e7-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.743215 4831 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 
10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.743226 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42db10e0-67a3-49d3-b6c5-8f48e31775e7-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.743236 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.746889 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "42db10e0-67a3-49d3-b6c5-8f48e31775e7" (UID: "42db10e0-67a3-49d3-b6c5-8f48e31775e7"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.791140 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8c9449df7-jllzg" event={"ID":"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0","Type":"ContainerStarted","Data":"bc6da550ab2bb6bd2d8edfefd85fcc26864743ab27e43602983bdfc5540aa04f"} Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.791190 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8c9449df7-jllzg" event={"ID":"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0","Type":"ContainerStarted","Data":"7cb0bd9fcbc366ce0679d8587d20618f6e8215db98b563ee925380acad94151d"} Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.791202 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8c9449df7-jllzg" event={"ID":"4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0","Type":"ContainerStarted","Data":"89c20e8a75bbfe00afa82c98de92393b7d64fd3ea000c6f51881bcba6b315106"} Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.792481 4831 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.792519 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.795146 4831 generic.go:334] "Generic (PLEG): container finished" podID="e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" containerID="93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064" exitCode=0 Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.795199 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" event={"ID":"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad","Type":"ContainerDied","Data":"93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064"} Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.795221 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" event={"ID":"e40b774b-77a1-455f-90c3-a4fe7ff1c9ad","Type":"ContainerDied","Data":"7478e31b15553d2de1dd23e54b7e18fe6046fb9e0131bd67ed1bb96d0522697b"} Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.795241 4831 scope.go:117] "RemoveContainer" containerID="93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.795364 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ddf665d6c-8wdxb" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.829401 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f14967a3-96b4-46ce-8685-0b644a080cc8","Type":"ContainerStarted","Data":"7a1ce6d425ab8f08be9e438ae877996cc4b941431730ec001a53308b161f2b87"} Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.837150 4831 generic.go:334] "Generic (PLEG): container finished" podID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerID="7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a" exitCode=137 Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.837195 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6694d6d998-wgcht" event={"ID":"42db10e0-67a3-49d3-b6c5-8f48e31775e7","Type":"ContainerDied","Data":"7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a"} Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.837218 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6694d6d998-wgcht" event={"ID":"42db10e0-67a3-49d3-b6c5-8f48e31775e7","Type":"ContainerDied","Data":"64dc7716d32e5689512dbff216be683df65b38feec19853322823f863328f562"} Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.837274 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6694d6d998-wgcht" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.844526 4831 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42db10e0-67a3-49d3-b6c5-8f48e31775e7-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.850615 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-8c9449df7-jllzg" podStartSLOduration=2.850592075 podStartE2EDuration="2.850592075s" podCreationTimestamp="2025-12-04 10:35:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:06.841473683 +0000 UTC m=+1203.790648997" watchObservedRunningTime="2025-12-04 10:35:06.850592075 +0000 UTC m=+1203.799767379" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.921275 4831 scope.go:117] "RemoveContainer" containerID="babba742bc0c252166bceb9fdddd621b9493270a5f6e6c907a55cac89a927268" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.932728 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ddf665d6c-8wdxb"] Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.942743 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ddf665d6c-8wdxb"] Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.951993 4831 scope.go:117] "RemoveContainer" containerID="93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064" Dec 04 10:35:06 crc kubenswrapper[4831]: E1204 10:35:06.952685 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064\": container with ID starting with 93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064 not found: ID does not exist" 
containerID="93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.952955 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064"} err="failed to get container status \"93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064\": rpc error: code = NotFound desc = could not find container \"93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064\": container with ID starting with 93fa1f9d1c6c393f66567187f37aed9234757c1848a2067184607761bb26b064 not found: ID does not exist" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.952996 4831 scope.go:117] "RemoveContainer" containerID="babba742bc0c252166bceb9fdddd621b9493270a5f6e6c907a55cac89a927268" Dec 04 10:35:06 crc kubenswrapper[4831]: E1204 10:35:06.953861 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babba742bc0c252166bceb9fdddd621b9493270a5f6e6c907a55cac89a927268\": container with ID starting with babba742bc0c252166bceb9fdddd621b9493270a5f6e6c907a55cac89a927268 not found: ID does not exist" containerID="babba742bc0c252166bceb9fdddd621b9493270a5f6e6c907a55cac89a927268" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.953995 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babba742bc0c252166bceb9fdddd621b9493270a5f6e6c907a55cac89a927268"} err="failed to get container status \"babba742bc0c252166bceb9fdddd621b9493270a5f6e6c907a55cac89a927268\": rpc error: code = NotFound desc = could not find container \"babba742bc0c252166bceb9fdddd621b9493270a5f6e6c907a55cac89a927268\": container with ID starting with babba742bc0c252166bceb9fdddd621b9493270a5f6e6c907a55cac89a927268 not found: ID does not exist" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.954147 4831 scope.go:117] 
"RemoveContainer" containerID="5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a" Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.954927 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6694d6d998-wgcht"] Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.969533 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6694d6d998-wgcht"] Dec 04 10:35:06 crc kubenswrapper[4831]: I1204 10:35:06.988305 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:35:07 crc kubenswrapper[4831]: I1204 10:35:07.211269 4831 scope.go:117] "RemoveContainer" containerID="7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a" Dec 04 10:35:07 crc kubenswrapper[4831]: I1204 10:35:07.252206 4831 scope.go:117] "RemoveContainer" containerID="5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a" Dec 04 10:35:07 crc kubenswrapper[4831]: E1204 10:35:07.252742 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a\": container with ID starting with 5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a not found: ID does not exist" containerID="5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a" Dec 04 10:35:07 crc kubenswrapper[4831]: I1204 10:35:07.252779 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a"} err="failed to get container status \"5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a\": rpc error: code = NotFound desc = could not find container \"5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a\": container with ID starting with 5c5695bb0b5c313b73a355c41b184d4d28fd82fedf874d90f04b2d87b2c1843a not found: ID 
does not exist" Dec 04 10:35:07 crc kubenswrapper[4831]: I1204 10:35:07.252806 4831 scope.go:117] "RemoveContainer" containerID="7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a" Dec 04 10:35:07 crc kubenswrapper[4831]: E1204 10:35:07.253322 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a\": container with ID starting with 7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a not found: ID does not exist" containerID="7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a" Dec 04 10:35:07 crc kubenswrapper[4831]: I1204 10:35:07.253344 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a"} err="failed to get container status \"7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a\": rpc error: code = NotFound desc = could not find container \"7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a\": container with ID starting with 7ee53d79998a754dec4a0a0360839ddcc132026c78ae90554a04a9bb064b610a not found: ID does not exist" Dec 04 10:35:07 crc kubenswrapper[4831]: I1204 10:35:07.295977 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" path="/var/lib/kubelet/pods/42db10e0-67a3-49d3-b6c5-8f48e31775e7/volumes" Dec 04 10:35:07 crc kubenswrapper[4831]: I1204 10:35:07.296582 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" path="/var/lib/kubelet/pods/e40b774b-77a1-455f-90c3-a4fe7ff1c9ad/volumes" Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.620578 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-744bfc5f58-hs9c9" Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.626685 
4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:47956->10.217.0.166:9311: read: connection reset by peer" Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.626739 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:47962->10.217.0.166:9311: read: connection reset by peer" Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.627228 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.627356 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.630221 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.679076 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.896943 4831 generic.go:334] "Generic (PLEG): container finished" podID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerID="128236ce2adfca9eacb943496711e32ee6b4f03660b05db1105554aed5dbb6d7" exitCode=0 Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.897047 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-545d99f8dd-55gfz" event={"ID":"be37a16a-5f7e-4f97-b71d-ee344177919c","Type":"ContainerDied","Data":"128236ce2adfca9eacb943496711e32ee6b4f03660b05db1105554aed5dbb6d7"} Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.907637 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f14967a3-96b4-46ce-8685-0b644a080cc8","Type":"ContainerStarted","Data":"5034b34510b64120bdd1d44c6caa27737a6f800a9f7581b860b0cb31b33edb6d"} Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.907949 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="ceilometer-central-agent" containerID="cri-o://f97e802035facbe195f98e253e44014237f087da841b85a9abe981efc0004f8e" gracePeriod=30 Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.908305 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="proxy-httpd" containerID="cri-o://5034b34510b64120bdd1d44c6caa27737a6f800a9f7581b860b0cb31b33edb6d" gracePeriod=30 Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.908501 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="sg-core" containerID="cri-o://7a1ce6d425ab8f08be9e438ae877996cc4b941431730ec001a53308b161f2b87" gracePeriod=30 Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.908575 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="ceilometer-notification-agent" containerID="cri-o://616e82a49829757f1f4dac4e5f428b0b3b339bf0e27da23b7cf9003f49b33e0c" gracePeriod=30 Dec 04 10:35:08 crc kubenswrapper[4831]: I1204 10:35:08.940644 4831 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.071284415 podStartE2EDuration="12.940201469s" podCreationTimestamp="2025-12-04 10:34:56 +0000 UTC" firstStartedPulling="2025-12-04 10:34:58.974312673 +0000 UTC m=+1195.923487977" lastFinishedPulling="2025-12-04 10:35:07.843229717 +0000 UTC m=+1204.792405031" observedRunningTime="2025-12-04 10:35:08.931082607 +0000 UTC m=+1205.880257921" watchObservedRunningTime="2025-12-04 10:35:08.940201469 +0000 UTC m=+1205.889376793" Dec 04 10:35:09 crc kubenswrapper[4831]: I1204 10:35:09.920306 4831 generic.go:334] "Generic (PLEG): container finished" podID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerID="5034b34510b64120bdd1d44c6caa27737a6f800a9f7581b860b0cb31b33edb6d" exitCode=0 Dec 04 10:35:09 crc kubenswrapper[4831]: I1204 10:35:09.920338 4831 generic.go:334] "Generic (PLEG): container finished" podID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerID="7a1ce6d425ab8f08be9e438ae877996cc4b941431730ec001a53308b161f2b87" exitCode=2 Dec 04 10:35:09 crc kubenswrapper[4831]: I1204 10:35:09.920346 4831 generic.go:334] "Generic (PLEG): container finished" podID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerID="616e82a49829757f1f4dac4e5f428b0b3b339bf0e27da23b7cf9003f49b33e0c" exitCode=0 Dec 04 10:35:09 crc kubenswrapper[4831]: I1204 10:35:09.920353 4831 generic.go:334] "Generic (PLEG): container finished" podID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerID="f97e802035facbe195f98e253e44014237f087da841b85a9abe981efc0004f8e" exitCode=0 Dec 04 10:35:09 crc kubenswrapper[4831]: I1204 10:35:09.920371 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f14967a3-96b4-46ce-8685-0b644a080cc8","Type":"ContainerDied","Data":"5034b34510b64120bdd1d44c6caa27737a6f800a9f7581b860b0cb31b33edb6d"} Dec 04 10:35:09 crc kubenswrapper[4831]: I1204 10:35:09.920395 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f14967a3-96b4-46ce-8685-0b644a080cc8","Type":"ContainerDied","Data":"7a1ce6d425ab8f08be9e438ae877996cc4b941431730ec001a53308b161f2b87"} Dec 04 10:35:09 crc kubenswrapper[4831]: I1204 10:35:09.920406 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f14967a3-96b4-46ce-8685-0b644a080cc8","Type":"ContainerDied","Data":"616e82a49829757f1f4dac4e5f428b0b3b339bf0e27da23b7cf9003f49b33e0c"} Dec 04 10:35:09 crc kubenswrapper[4831]: I1204 10:35:09.920414 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f14967a3-96b4-46ce-8685-0b644a080cc8","Type":"ContainerDied","Data":"f97e802035facbe195f98e253e44014237f087da841b85a9abe981efc0004f8e"} Dec 04 10:35:10 crc kubenswrapper[4831]: I1204 10:35:10.231889 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Dec 04 10:35:10 crc kubenswrapper[4831]: I1204 10:35:10.231922 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Dec 04 10:35:10 crc kubenswrapper[4831]: I1204 10:35:10.232380 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:35:10 crc kubenswrapper[4831]: I1204 10:35:10.961380 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 10:35:10 crc kubenswrapper[4831]: I1204 10:35:10.961738 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/watcher-decision-engine-0" Dec 04 10:35:10 crc kubenswrapper[4831]: I1204 10:35:10.962542 4831 scope.go:117] "RemoveContainer" containerID="bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3" Dec 04 10:35:10 crc kubenswrapper[4831]: E1204 10:35:10.962911 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6040f79c-a151-4681-a758-d2741bff68b6)\"" pod="openstack/watcher-decision-engine-0" podUID="6040f79c-a151-4681-a758-d2741bff68b6" Dec 04 10:35:11 crc kubenswrapper[4831]: I1204 10:35:11.958007 4831 generic.go:334] "Generic (PLEG): container finished" podID="94129f00-4043-4552-9724-feef1585cd20" containerID="0932a40e56e000a340375dbe475e2b4e990f48ccb1c48929447bb5a112edb0b3" exitCode=0 Dec 04 10:35:11 crc kubenswrapper[4831]: I1204 10:35:11.958076 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dmcxs" event={"ID":"94129f00-4043-4552-9724-feef1585cd20","Type":"ContainerDied","Data":"0932a40e56e000a340375dbe475e2b4e990f48ccb1c48929447bb5a112edb0b3"} Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.843639 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-x7lqv"] Dec 04 10:35:13 crc kubenswrapper[4831]: E1204 10:35:13.845653 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" containerName="dnsmasq-dns" Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.845771 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" containerName="dnsmasq-dns" Dec 04 10:35:13 crc kubenswrapper[4831]: E1204 10:35:13.845851 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon" Dec 04 10:35:13 crc 
kubenswrapper[4831]: I1204 10:35:13.845917 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon" Dec 04 10:35:13 crc kubenswrapper[4831]: E1204 10:35:13.846018 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" containerName="init" Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.846098 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" containerName="init" Dec 04 10:35:13 crc kubenswrapper[4831]: E1204 10:35:13.846189 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon-log" Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.846261 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon-log" Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.846561 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon-log" Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.846690 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e40b774b-77a1-455f-90c3-a4fe7ff1c9ad" containerName="dnsmasq-dns" Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.846780 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="42db10e0-67a3-49d3-b6c5-8f48e31775e7" containerName="horizon" Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.847772 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-x7lqv" Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.860516 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-x7lqv"] Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.916147 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q7nr\" (UniqueName: \"kubernetes.io/projected/f292b8ac-6250-4a8a-b73e-75c6aeebe9d5-kube-api-access-9q7nr\") pod \"nova-api-db-create-x7lqv\" (UID: \"f292b8ac-6250-4a8a-b73e-75c6aeebe9d5\") " pod="openstack/nova-api-db-create-x7lqv" Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.929823 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-vthqg"] Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.931540 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vthqg" Dec 04 10:35:13 crc kubenswrapper[4831]: I1204 10:35:13.942003 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vthqg"] Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.023825 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jm88\" (UniqueName: \"kubernetes.io/projected/292354b3-46c7-4c76-b593-dda39380e797-kube-api-access-2jm88\") pod \"nova-cell0-db-create-vthqg\" (UID: \"292354b3-46c7-4c76-b593-dda39380e797\") " pod="openstack/nova-cell0-db-create-vthqg" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.023893 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q7nr\" (UniqueName: \"kubernetes.io/projected/f292b8ac-6250-4a8a-b73e-75c6aeebe9d5-kube-api-access-9q7nr\") pod \"nova-api-db-create-x7lqv\" (UID: \"f292b8ac-6250-4a8a-b73e-75c6aeebe9d5\") " pod="openstack/nova-api-db-create-x7lqv" Dec 04 10:35:14 crc kubenswrapper[4831]: 
I1204 10:35:14.044750 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hd6qr"] Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.046065 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hd6qr" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.055629 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hd6qr"] Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.067530 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q7nr\" (UniqueName: \"kubernetes.io/projected/f292b8ac-6250-4a8a-b73e-75c6aeebe9d5-kube-api-access-9q7nr\") pod \"nova-api-db-create-x7lqv\" (UID: \"f292b8ac-6250-4a8a-b73e-75c6aeebe9d5\") " pod="openstack/nova-api-db-create-x7lqv" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.128228 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlwrv\" (UniqueName: \"kubernetes.io/projected/9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2-kube-api-access-vlwrv\") pod \"nova-cell1-db-create-hd6qr\" (UID: \"9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2\") " pod="openstack/nova-cell1-db-create-hd6qr" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.128376 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jm88\" (UniqueName: \"kubernetes.io/projected/292354b3-46c7-4c76-b593-dda39380e797-kube-api-access-2jm88\") pod \"nova-cell0-db-create-vthqg\" (UID: \"292354b3-46c7-4c76-b593-dda39380e797\") " pod="openstack/nova-cell0-db-create-vthqg" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.144489 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jm88\" (UniqueName: \"kubernetes.io/projected/292354b3-46c7-4c76-b593-dda39380e797-kube-api-access-2jm88\") pod \"nova-cell0-db-create-vthqg\" (UID: 
\"292354b3-46c7-4c76-b593-dda39380e797\") " pod="openstack/nova-cell0-db-create-vthqg" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.172160 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x7lqv" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.230328 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlwrv\" (UniqueName: \"kubernetes.io/projected/9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2-kube-api-access-vlwrv\") pod \"nova-cell1-db-create-hd6qr\" (UID: \"9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2\") " pod="openstack/nova-cell1-db-create-hd6qr" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.249821 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlwrv\" (UniqueName: \"kubernetes.io/projected/9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2-kube-api-access-vlwrv\") pod \"nova-cell1-db-create-hd6qr\" (UID: \"9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2\") " pod="openstack/nova-cell1-db-create-hd6qr" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.250697 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vthqg" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.407608 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hd6qr" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.733382 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:14 crc kubenswrapper[4831]: I1204 10:35:14.734653 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-8c9449df7-jllzg" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.231307 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.231411 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-545d99f8dd-55gfz" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.417208 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.516426 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.568934 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxr78\" (UniqueName: \"kubernetes.io/projected/94129f00-4043-4552-9724-feef1585cd20-kube-api-access-pxr78\") pod \"94129f00-4043-4552-9724-feef1585cd20\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.569019 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-scripts\") pod \"94129f00-4043-4552-9724-feef1585cd20\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.569134 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-config-data\") pod \"94129f00-4043-4552-9724-feef1585cd20\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.569198 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-db-sync-config-data\") pod \"94129f00-4043-4552-9724-feef1585cd20\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.569240 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-combined-ca-bundle\") pod \"94129f00-4043-4552-9724-feef1585cd20\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.569330 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/94129f00-4043-4552-9724-feef1585cd20-etc-machine-id\") pod \"94129f00-4043-4552-9724-feef1585cd20\" (UID: \"94129f00-4043-4552-9724-feef1585cd20\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.570204 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94129f00-4043-4552-9724-feef1585cd20-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "94129f00-4043-4552-9724-feef1585cd20" (UID: "94129f00-4043-4552-9724-feef1585cd20"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.590399 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "94129f00-4043-4552-9724-feef1585cd20" (UID: "94129f00-4043-4552-9724-feef1585cd20"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.593578 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-scripts" (OuterVolumeSpecName: "scripts") pod "94129f00-4043-4552-9724-feef1585cd20" (UID: "94129f00-4043-4552-9724-feef1585cd20"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.601639 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94129f00-4043-4552-9724-feef1585cd20-kube-api-access-pxr78" (OuterVolumeSpecName: "kube-api-access-pxr78") pod "94129f00-4043-4552-9724-feef1585cd20" (UID: "94129f00-4043-4552-9724-feef1585cd20"). InnerVolumeSpecName "kube-api-access-pxr78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.621216 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.622857 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94129f00-4043-4552-9724-feef1585cd20" (UID: "94129f00-4043-4552-9724-feef1585cd20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.630028 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-config-data" (OuterVolumeSpecName: "config-data") pod "94129f00-4043-4552-9724-feef1585cd20" (UID: "94129f00-4043-4552-9724-feef1585cd20"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.671334 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-log-httpd\") pod \"f14967a3-96b4-46ce-8685-0b644a080cc8\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.671438 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-scripts\") pod \"f14967a3-96b4-46ce-8685-0b644a080cc8\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.671492 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-run-httpd\") pod \"f14967a3-96b4-46ce-8685-0b644a080cc8\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.671508 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-combined-ca-bundle\") pod \"f14967a3-96b4-46ce-8685-0b644a080cc8\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.671585 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2hfl\" (UniqueName: \"kubernetes.io/projected/f14967a3-96b4-46ce-8685-0b644a080cc8-kube-api-access-b2hfl\") pod \"f14967a3-96b4-46ce-8685-0b644a080cc8\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.671653 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-config-data\") pod \"f14967a3-96b4-46ce-8685-0b644a080cc8\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.671702 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-sg-core-conf-yaml\") pod \"f14967a3-96b4-46ce-8685-0b644a080cc8\" (UID: \"f14967a3-96b4-46ce-8685-0b644a080cc8\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.672036 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.672053 4831 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.672065 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.672076 4831 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94129f00-4043-4552-9724-feef1585cd20-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.672086 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxr78\" (UniqueName: \"kubernetes.io/projected/94129f00-4043-4552-9724-feef1585cd20-kube-api-access-pxr78\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.672096 4831 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94129f00-4043-4552-9724-feef1585cd20-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.672260 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f14967a3-96b4-46ce-8685-0b644a080cc8" (UID: "f14967a3-96b4-46ce-8685-0b644a080cc8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.673617 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f14967a3-96b4-46ce-8685-0b644a080cc8" (UID: "f14967a3-96b4-46ce-8685-0b644a080cc8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.676811 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-scripts" (OuterVolumeSpecName: "scripts") pod "f14967a3-96b4-46ce-8685-0b644a080cc8" (UID: "f14967a3-96b4-46ce-8685-0b644a080cc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.685635 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f14967a3-96b4-46ce-8685-0b644a080cc8-kube-api-access-b2hfl" (OuterVolumeSpecName: "kube-api-access-b2hfl") pod "f14967a3-96b4-46ce-8685-0b644a080cc8" (UID: "f14967a3-96b4-46ce-8685-0b644a080cc8"). InnerVolumeSpecName "kube-api-access-b2hfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.721575 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f14967a3-96b4-46ce-8685-0b644a080cc8" (UID: "f14967a3-96b4-46ce-8685-0b644a080cc8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.747619 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vthqg"] Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.762570 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f14967a3-96b4-46ce-8685-0b644a080cc8" (UID: "f14967a3-96b4-46ce-8685-0b644a080cc8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.772479 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hd6qr"] Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.772709 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data\") pod \"be37a16a-5f7e-4f97-b71d-ee344177919c\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.772766 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be37a16a-5f7e-4f97-b71d-ee344177919c-logs\") pod \"be37a16a-5f7e-4f97-b71d-ee344177919c\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.772920 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data-custom\") pod \"be37a16a-5f7e-4f97-b71d-ee344177919c\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.772949 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-combined-ca-bundle\") pod \"be37a16a-5f7e-4f97-b71d-ee344177919c\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.773042 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prffl\" (UniqueName: \"kubernetes.io/projected/be37a16a-5f7e-4f97-b71d-ee344177919c-kube-api-access-prffl\") pod \"be37a16a-5f7e-4f97-b71d-ee344177919c\" (UID: \"be37a16a-5f7e-4f97-b71d-ee344177919c\") " Dec 04 10:35:15 crc 
kubenswrapper[4831]: I1204 10:35:15.773385 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.773402 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.773411 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.773420 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f14967a3-96b4-46ce-8685-0b644a080cc8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.773428 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.773436 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2hfl\" (UniqueName: \"kubernetes.io/projected/f14967a3-96b4-46ce-8685-0b644a080cc8-kube-api-access-b2hfl\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.773906 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be37a16a-5f7e-4f97-b71d-ee344177919c-logs" (OuterVolumeSpecName: "logs") pod "be37a16a-5f7e-4f97-b71d-ee344177919c" (UID: "be37a16a-5f7e-4f97-b71d-ee344177919c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.777266 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be37a16a-5f7e-4f97-b71d-ee344177919c" (UID: "be37a16a-5f7e-4f97-b71d-ee344177919c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.780860 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be37a16a-5f7e-4f97-b71d-ee344177919c-kube-api-access-prffl" (OuterVolumeSpecName: "kube-api-access-prffl") pod "be37a16a-5f7e-4f97-b71d-ee344177919c" (UID: "be37a16a-5f7e-4f97-b71d-ee344177919c"). InnerVolumeSpecName "kube-api-access-prffl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.809945 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-config-data" (OuterVolumeSpecName: "config-data") pod "f14967a3-96b4-46ce-8685-0b644a080cc8" (UID: "f14967a3-96b4-46ce-8685-0b644a080cc8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.832262 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be37a16a-5f7e-4f97-b71d-ee344177919c" (UID: "be37a16a-5f7e-4f97-b71d-ee344177919c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.843566 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data" (OuterVolumeSpecName: "config-data") pod "be37a16a-5f7e-4f97-b71d-ee344177919c" (UID: "be37a16a-5f7e-4f97-b71d-ee344177919c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.874883 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be37a16a-5f7e-4f97-b71d-ee344177919c-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.874928 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.874938 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.874947 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f14967a3-96b4-46ce-8685-0b644a080cc8-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.874957 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prffl\" (UniqueName: \"kubernetes.io/projected/be37a16a-5f7e-4f97-b71d-ee344177919c-kube-api-access-prffl\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.874968 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be37a16a-5f7e-4f97-b71d-ee344177919c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.955179 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-x7lqv"] Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.996669 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-545d99f8dd-55gfz" event={"ID":"be37a16a-5f7e-4f97-b71d-ee344177919c","Type":"ContainerDied","Data":"552f7a751bd7b367ae51aabd39eca1765da50a5dafba09c31a07128eb31c2095"} Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.996695 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-545d99f8dd-55gfz" Dec 04 10:35:15 crc kubenswrapper[4831]: I1204 10:35:15.996736 4831 scope.go:117] "RemoveContainer" containerID="128236ce2adfca9eacb943496711e32ee6b4f03660b05db1105554aed5dbb6d7" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.000237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f14967a3-96b4-46ce-8685-0b644a080cc8","Type":"ContainerDied","Data":"c4e5baced66da659af4ca27729007be4ba62d7cc44df7500186c46d2ff071c45"} Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.000336 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.003387 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hd6qr" event={"ID":"9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2","Type":"ContainerStarted","Data":"c41373f79e32aa972af49c9686a345355f2b1507c776d100bb2e83e3fadd9eed"} Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.003431 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hd6qr" event={"ID":"9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2","Type":"ContainerStarted","Data":"ce2f9f60319266dd6167a561e1f6c620715d4f895ba56cf9529c9a5c95d0032a"} Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.008590 4831 generic.go:334] "Generic (PLEG): container finished" podID="18f48aa7-65d1-41ce-bc0d-4973db8b7abe" containerID="374ed1560d678f11ec4bed60c79dbd54071ebb83505a5e220a5dee9f726db39f" exitCode=0 Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.008646 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-g5bvk" event={"ID":"18f48aa7-65d1-41ce-bc0d-4973db8b7abe","Type":"ContainerDied","Data":"374ed1560d678f11ec4bed60c79dbd54071ebb83505a5e220a5dee9f726db39f"} Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.010444 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"e13429a8-0182-4bfb-99b2-d1941a1e9af6","Type":"ContainerStarted","Data":"d6bae4e9745e1601eda596cb19884f06863d10860d8e1fec379f4bf8413a115f"} Dec 04 10:35:16 crc kubenswrapper[4831]: W1204 10:35:16.012195 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf292b8ac_6250_4a8a_b73e_75c6aeebe9d5.slice/crio-46b72d94d1aa3111d776923f4eeeedce8a41d35a0913464136c56d1074b34db0 WatchSource:0}: Error finding container 46b72d94d1aa3111d776923f4eeeedce8a41d35a0913464136c56d1074b34db0: Status 404 returned error can't find the 
container with id 46b72d94d1aa3111d776923f4eeeedce8a41d35a0913464136c56d1074b34db0 Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.013027 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vthqg" event={"ID":"292354b3-46c7-4c76-b593-dda39380e797","Type":"ContainerStarted","Data":"b1f56fd39c8ea9d813961a992c388c6af7b8f455249d842576221b585ac0e10e"} Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.013128 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vthqg" event={"ID":"292354b3-46c7-4c76-b593-dda39380e797","Type":"ContainerStarted","Data":"b1e4c135cb3a055c1a060be812602729257d425f63eb000cd0e6f8298af201f8"} Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.017090 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dmcxs" event={"ID":"94129f00-4043-4552-9724-feef1585cd20","Type":"ContainerDied","Data":"12e2ef018400b0e25abb0efd00ab7137f3ef68f1bf719dc0723c4361151114ed"} Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.017123 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12e2ef018400b0e25abb0efd00ab7137f3ef68f1bf719dc0723c4361151114ed" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.017195 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dmcxs" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.032915 4831 scope.go:117] "RemoveContainer" containerID="71f5a152442f7f92c12464a98c106ae920c407a47b36f18c73a5483549259c64" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.033607 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-hd6qr" podStartSLOduration=2.033586245 podStartE2EDuration="2.033586245s" podCreationTimestamp="2025-12-04 10:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:16.017273541 +0000 UTC m=+1212.966448855" watchObservedRunningTime="2025-12-04 10:35:16.033586245 +0000 UTC m=+1212.982761559" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.062090 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.69276188 podStartE2EDuration="21.061984747s" podCreationTimestamp="2025-12-04 10:34:55 +0000 UTC" firstStartedPulling="2025-12-04 10:34:58.801005201 +0000 UTC m=+1195.750180515" lastFinishedPulling="2025-12-04 10:35:15.170228068 +0000 UTC m=+1212.119403382" observedRunningTime="2025-12-04 10:35:16.041145618 +0000 UTC m=+1212.990320922" watchObservedRunningTime="2025-12-04 10:35:16.061984747 +0000 UTC m=+1213.011160061" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.079689 4831 scope.go:117] "RemoveContainer" containerID="5034b34510b64120bdd1d44c6caa27737a6f800a9f7581b860b0cb31b33edb6d" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.086707 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-vthqg" podStartSLOduration=3.086691734 podStartE2EDuration="3.086691734s" podCreationTimestamp="2025-12-04 10:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-04 10:35:16.076318751 +0000 UTC m=+1213.025494075" watchObservedRunningTime="2025-12-04 10:35:16.086691734 +0000 UTC m=+1213.035867048" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.109266 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.138471 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.158363 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-545d99f8dd-55gfz"] Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.169057 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:16 crc kubenswrapper[4831]: E1204 10:35:16.169601 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="ceilometer-central-agent" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.169629 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="ceilometer-central-agent" Dec 04 10:35:16 crc kubenswrapper[4831]: E1204 10:35:16.169652 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api-log" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.169684 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api-log" Dec 04 10:35:16 crc kubenswrapper[4831]: E1204 10:35:16.169698 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="sg-core" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.169706 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="sg-core" Dec 04 10:35:16 crc 
kubenswrapper[4831]: E1204 10:35:16.169730 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="proxy-httpd" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.169737 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="proxy-httpd" Dec 04 10:35:16 crc kubenswrapper[4831]: E1204 10:35:16.169749 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.169762 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api" Dec 04 10:35:16 crc kubenswrapper[4831]: E1204 10:35:16.169795 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="ceilometer-notification-agent" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.169806 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="ceilometer-notification-agent" Dec 04 10:35:16 crc kubenswrapper[4831]: E1204 10:35:16.169818 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94129f00-4043-4552-9724-feef1585cd20" containerName="cinder-db-sync" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.169826 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="94129f00-4043-4552-9724-feef1585cd20" containerName="cinder-db-sync" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.170133 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="94129f00-4043-4552-9724-feef1585cd20" containerName="cinder-db-sync" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.170158 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="proxy-httpd" Dec 04 10:35:16 
crc kubenswrapper[4831]: I1204 10:35:16.170170 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api-log" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.170178 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="ceilometer-notification-agent" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.170189 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" containerName="barbican-api" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.170201 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="ceilometer-central-agent" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.170209 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" containerName="sg-core" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.173827 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.177135 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.180371 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-545d99f8dd-55gfz"] Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.187900 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.198402 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.283627 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-scripts\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.283706 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-log-httpd\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.283729 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-run-httpd\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.283747 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.283799 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.283817 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skj6x\" (UniqueName: \"kubernetes.io/projected/7fa9407a-9a04-4712-bac8-d89067b14304-kube-api-access-skj6x\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.283891 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-config-data\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.344999 4831 scope.go:117] "RemoveContainer" containerID="7a1ce6d425ab8f08be9e438ae877996cc4b941431730ec001a53308b161f2b87" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.375579 4831 scope.go:117] "RemoveContainer" containerID="616e82a49829757f1f4dac4e5f428b0b3b339bf0e27da23b7cf9003f49b33e0c" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.385838 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-log-httpd\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " 
pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.385888 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-run-httpd\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.385907 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.385983 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.386004 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skj6x\" (UniqueName: \"kubernetes.io/projected/7fa9407a-9a04-4712-bac8-d89067b14304-kube-api-access-skj6x\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.386475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-log-httpd\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.386555 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-run-httpd\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.386620 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-config-data\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.386682 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-scripts\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.390463 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-scripts\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.391104 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.391114 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.406995 4831 scope.go:117] 
"RemoveContainer" containerID="f97e802035facbe195f98e253e44014237f087da841b85a9abe981efc0004f8e" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.408111 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-config-data\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.421427 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skj6x\" (UniqueName: \"kubernetes.io/projected/7fa9407a-9a04-4712-bac8-d89067b14304-kube-api-access-skj6x\") pod \"ceilometer-0\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.651619 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.656331 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.658094 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.662319 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.662706 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5f2ml" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.662904 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.663072 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.694817 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.709401 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-678bb56fc5-tf8dc"] Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.711035 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.748424 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-678bb56fc5-tf8dc"] Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.799958 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.800014 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.800043 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-sb\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.800073 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwp95\" (UniqueName: \"kubernetes.io/projected/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-kube-api-access-qwp95\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.800102 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-config\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.800154 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65zwx\" (UniqueName: \"kubernetes.io/projected/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-kube-api-access-65zwx\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.800178 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-nb\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.800204 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-scripts\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.800232 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-svc\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.800254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.800309 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.800367 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-swift-storage-0\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.863822 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.869946 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.872995 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.894876 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.901920 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-swift-storage-0\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.902198 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.902239 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-sb\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.902284 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.902497 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qwp95\" (UniqueName: \"kubernetes.io/projected/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-kube-api-access-qwp95\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.902530 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-config\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.902609 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65zwx\" (UniqueName: \"kubernetes.io/projected/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-kube-api-access-65zwx\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.902683 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-nb\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.902716 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-scripts\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.902762 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-svc\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.902796 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.902887 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.903096 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.904701 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-swift-storage-0\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.906475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-sb\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: 
\"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.909204 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-nb\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.909379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-svc\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.910402 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.910413 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-scripts\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.911262 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-config\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.915076 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.911021 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.931167 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65zwx\" (UniqueName: \"kubernetes.io/projected/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-kube-api-access-65zwx\") pod \"cinder-scheduler-0\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:16 crc kubenswrapper[4831]: I1204 10:35:16.933651 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwp95\" (UniqueName: \"kubernetes.io/projected/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-kube-api-access-qwp95\") pod \"dnsmasq-dns-678bb56fc5-tf8dc\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.004363 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data-custom\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.004432 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-logs\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.004457 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.004583 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpf5f\" (UniqueName: \"kubernetes.io/projected/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-kube-api-access-kpf5f\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.004654 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.004734 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-scripts\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.004750 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.032598 4831 generic.go:334] "Generic (PLEG): container finished" podID="f292b8ac-6250-4a8a-b73e-75c6aeebe9d5" containerID="89f75b3af9abc32d45d9c268e7e64c96787f53d6d6d7e348929c9398347edde5" exitCode=0 Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.032724 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x7lqv" event={"ID":"f292b8ac-6250-4a8a-b73e-75c6aeebe9d5","Type":"ContainerDied","Data":"89f75b3af9abc32d45d9c268e7e64c96787f53d6d6d7e348929c9398347edde5"} Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.032756 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x7lqv" event={"ID":"f292b8ac-6250-4a8a-b73e-75c6aeebe9d5","Type":"ContainerStarted","Data":"46b72d94d1aa3111d776923f4eeeedce8a41d35a0913464136c56d1074b34db0"} Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.055206 4831 generic.go:334] "Generic (PLEG): container finished" podID="292354b3-46c7-4c76-b593-dda39380e797" containerID="b1f56fd39c8ea9d813961a992c388c6af7b8f455249d842576221b585ac0e10e" exitCode=0 Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.055267 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vthqg" event={"ID":"292354b3-46c7-4c76-b593-dda39380e797","Type":"ContainerDied","Data":"b1f56fd39c8ea9d813961a992c388c6af7b8f455249d842576221b585ac0e10e"} Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.093550 4831 generic.go:334] "Generic (PLEG): container finished" podID="9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2" containerID="c41373f79e32aa972af49c9686a345355f2b1507c776d100bb2e83e3fadd9eed" exitCode=0 Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.093666 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hd6qr" 
event={"ID":"9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2","Type":"ContainerDied","Data":"c41373f79e32aa972af49c9686a345355f2b1507c776d100bb2e83e3fadd9eed"} Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.110781 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data-custom\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.113554 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.116545 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-logs\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.116615 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.116778 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpf5f\" (UniqueName: \"kubernetes.io/projected/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-kube-api-access-kpf5f\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.116828 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.116966 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-scripts\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.116987 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.117074 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.117371 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-logs\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.119157 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.120065 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data-custom\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.121533 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.124945 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.126048 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-scripts\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.139936 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpf5f\" (UniqueName: \"kubernetes.io/projected/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-kube-api-access-kpf5f\") pod \"cinder-api-0\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.206438 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.317951 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be37a16a-5f7e-4f97-b71d-ee344177919c" path="/var/lib/kubelet/pods/be37a16a-5f7e-4f97-b71d-ee344177919c/volumes" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.318934 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f14967a3-96b4-46ce-8685-0b644a080cc8" path="/var/lib/kubelet/pods/f14967a3-96b4-46ce-8685-0b644a080cc8/volumes" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.319648 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.713429 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-678bb56fc5-tf8dc"] Dec 04 10:35:17 crc kubenswrapper[4831]: W1204 10:35:17.715790 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04bc7d54_ace7_4cd2_b867_9ccc94190c9d.slice/crio-346acaf7bf950e2341827380b42dc6eb6fe26e8250028da61a75b19ec7319a9e WatchSource:0}: Error finding container 346acaf7bf950e2341827380b42dc6eb6fe26e8250028da61a75b19ec7319a9e: Status 404 returned error can't find the container with id 346acaf7bf950e2341827380b42dc6eb6fe26e8250028da61a75b19ec7319a9e Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.780799 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-g5bvk" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.897100 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.907575 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.955336 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-combined-ca-bundle\") pod \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.955405 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-db-sync-config-data\") pod \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.955586 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-config-data\") pod \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.955647 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpv48\" (UniqueName: \"kubernetes.io/projected/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-kube-api-access-rpv48\") pod \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\" (UID: \"18f48aa7-65d1-41ce-bc0d-4973db8b7abe\") " Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.961752 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-kube-api-access-rpv48" (OuterVolumeSpecName: "kube-api-access-rpv48") pod "18f48aa7-65d1-41ce-bc0d-4973db8b7abe" (UID: "18f48aa7-65d1-41ce-bc0d-4973db8b7abe"). InnerVolumeSpecName "kube-api-access-rpv48". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.962722 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "18f48aa7-65d1-41ce-bc0d-4973db8b7abe" (UID: "18f48aa7-65d1-41ce-bc0d-4973db8b7abe"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:17 crc kubenswrapper[4831]: I1204 10:35:17.998354 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18f48aa7-65d1-41ce-bc0d-4973db8b7abe" (UID: "18f48aa7-65d1-41ce-bc0d-4973db8b7abe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.012965 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-config-data" (OuterVolumeSpecName: "config-data") pod "18f48aa7-65d1-41ce-bc0d-4973db8b7abe" (UID: "18f48aa7-65d1-41ce-bc0d-4973db8b7abe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.057973 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.058002 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpv48\" (UniqueName: \"kubernetes.io/projected/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-kube-api-access-rpv48\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.058013 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.058021 4831 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18f48aa7-65d1-41ce-bc0d-4973db8b7abe-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.128563 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-g5bvk"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.129707 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-g5bvk" event={"ID":"18f48aa7-65d1-41ce-bc0d-4973db8b7abe","Type":"ContainerDied","Data":"2083b444af07eabf8c87ec1c8bad3d26b613524ebda6f40e4e88f35252cc9b96"}
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.129742 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2083b444af07eabf8c87ec1c8bad3d26b613524ebda6f40e4e88f35252cc9b96"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.133494 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8165b5c-aa05-4fbb-9541-7b25c5a75dab","Type":"ContainerStarted","Data":"12de460683e4c03bc34512e482b76b5d7298809b4fd628c9cf984de59df767c1"}
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.135727 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fa9407a-9a04-4712-bac8-d89067b14304","Type":"ContainerStarted","Data":"0513a579e5eccabc0b8e66fb008748bbcdc24aad70905a810665e9e5784cd89a"}
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.135767 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fa9407a-9a04-4712-bac8-d89067b14304","Type":"ContainerStarted","Data":"55d6de675878e9339fa5a8febe11f79629911d2d33419b23452f26a864f1668b"}
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.135779 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fa9407a-9a04-4712-bac8-d89067b14304","Type":"ContainerStarted","Data":"b401e568c3b353c6e76d648da930dd5b0aff62abe12d2b538e33eae5a75ea49d"}
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.145332 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72ca3a62-9e01-4f14-8b45-6db4951fb6e6","Type":"ContainerStarted","Data":"f5c0f6158fd16b893efde2972c39137b2cdfb5869987b5694af4623a589af274"}
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.167526 4831 generic.go:334] "Generic (PLEG): container finished" podID="04bc7d54-ace7-4cd2-b867-9ccc94190c9d" containerID="193ceedcf09cb420ef02c38ea265ee3fc936a69b3f0e2e2dddeeccd1fc108251" exitCode=0
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.167648 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" event={"ID":"04bc7d54-ace7-4cd2-b867-9ccc94190c9d","Type":"ContainerDied","Data":"193ceedcf09cb420ef02c38ea265ee3fc936a69b3f0e2e2dddeeccd1fc108251"}
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.168107 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" event={"ID":"04bc7d54-ace7-4cd2-b867-9ccc94190c9d","Type":"ContainerStarted","Data":"346acaf7bf950e2341827380b42dc6eb6fe26e8250028da61a75b19ec7319a9e"}
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.637190 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-678bb56fc5-tf8dc"]
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.753646 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-665f7b5ff9-klz7d"]
Dec 04 10:35:18 crc kubenswrapper[4831]: E1204 10:35:18.754187 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f48aa7-65d1-41ce-bc0d-4973db8b7abe" containerName="glance-db-sync"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.754203 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f48aa7-65d1-41ce-bc0d-4973db8b7abe" containerName="glance-db-sync"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.754414 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f48aa7-65d1-41ce-bc0d-4973db8b7abe" containerName="glance-db-sync"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.755687 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.768371 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-665f7b5ff9-klz7d"]
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.788362 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg7sl\" (UniqueName: \"kubernetes.io/projected/5173ee92-12de-4849-9659-882e5cfb1566-kube-api-access-rg7sl\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.788444 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-swift-storage-0\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.788492 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-sb\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.788537 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-svc\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.788562 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-config\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.788626 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-nb\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.890105 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg7sl\" (UniqueName: \"kubernetes.io/projected/5173ee92-12de-4849-9659-882e5cfb1566-kube-api-access-rg7sl\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.890244 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-swift-storage-0\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.890380 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-sb\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.890430 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-svc\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.890476 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-config\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.890564 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-nb\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.891281 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-swift-storage-0\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.891345 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-svc\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.891890 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-config\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.891956 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-sb\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.896450 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vthqg"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.900614 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-nb\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.912761 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg7sl\" (UniqueName: \"kubernetes.io/projected/5173ee92-12de-4849-9659-882e5cfb1566-kube-api-access-rg7sl\") pod \"dnsmasq-dns-665f7b5ff9-klz7d\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.980131 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x7lqv"
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.992629 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q7nr\" (UniqueName: \"kubernetes.io/projected/f292b8ac-6250-4a8a-b73e-75c6aeebe9d5-kube-api-access-9q7nr\") pod \"f292b8ac-6250-4a8a-b73e-75c6aeebe9d5\" (UID: \"f292b8ac-6250-4a8a-b73e-75c6aeebe9d5\") "
Dec 04 10:35:18 crc kubenswrapper[4831]: I1204 10:35:18.992751 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jm88\" (UniqueName: \"kubernetes.io/projected/292354b3-46c7-4c76-b593-dda39380e797-kube-api-access-2jm88\") pod \"292354b3-46c7-4c76-b593-dda39380e797\" (UID: \"292354b3-46c7-4c76-b593-dda39380e797\") "
Dec 04 10:35:19 crc kubenswrapper[4831]: I1204 10:35:19.001841 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/292354b3-46c7-4c76-b593-dda39380e797-kube-api-access-2jm88" (OuterVolumeSpecName: "kube-api-access-2jm88") pod "292354b3-46c7-4c76-b593-dda39380e797" (UID: "292354b3-46c7-4c76-b593-dda39380e797"). InnerVolumeSpecName "kube-api-access-2jm88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:35:19 crc kubenswrapper[4831]: I1204 10:35:19.019943 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f292b8ac-6250-4a8a-b73e-75c6aeebe9d5-kube-api-access-9q7nr" (OuterVolumeSpecName: "kube-api-access-9q7nr") pod "f292b8ac-6250-4a8a-b73e-75c6aeebe9d5" (UID: "f292b8ac-6250-4a8a-b73e-75c6aeebe9d5"). InnerVolumeSpecName "kube-api-access-9q7nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:35:19 crc kubenswrapper[4831]: I1204 10:35:19.095246 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q7nr\" (UniqueName: \"kubernetes.io/projected/f292b8ac-6250-4a8a-b73e-75c6aeebe9d5-kube-api-access-9q7nr\") on node \"crc\" DevicePath \"\""
Dec 04 10:35:19 crc kubenswrapper[4831]: I1204 10:35:19.095284 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jm88\" (UniqueName: \"kubernetes.io/projected/292354b3-46c7-4c76-b593-dda39380e797-kube-api-access-2jm88\") on node \"crc\" DevicePath \"\""
Dec 04 10:35:19 crc kubenswrapper[4831]: I1204 10:35:19.117148 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.599834 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x7lqv"
Dec 04 10:35:20 crc kubenswrapper[4831]: E1204 10:35:20.607912 4831 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.332s"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.607945 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 10:35:20 crc kubenswrapper[4831]: E1204 10:35:20.608226 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f292b8ac-6250-4a8a-b73e-75c6aeebe9d5" containerName="mariadb-database-create"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.608241 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f292b8ac-6250-4a8a-b73e-75c6aeebe9d5" containerName="mariadb-database-create"
Dec 04 10:35:20 crc kubenswrapper[4831]: E1204 10:35:20.608261 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="292354b3-46c7-4c76-b593-dda39380e797" containerName="mariadb-database-create"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.608269 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="292354b3-46c7-4c76-b593-dda39380e797" containerName="mariadb-database-create"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.611549 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="292354b3-46c7-4c76-b593-dda39380e797" containerName="mariadb-database-create"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.611585 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f292b8ac-6250-4a8a-b73e-75c6aeebe9d5" containerName="mariadb-database-create"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.612619 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.612643 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-665f7b5ff9-klz7d"]
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.612654 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x7lqv" event={"ID":"f292b8ac-6250-4a8a-b73e-75c6aeebe9d5","Type":"ContainerDied","Data":"46b72d94d1aa3111d776923f4eeeedce8a41d35a0913464136c56d1074b34db0"}
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.612690 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46b72d94d1aa3111d776923f4eeeedce8a41d35a0913464136c56d1074b34db0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.612703 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.613815 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.613862 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.616787 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.619876 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hd6qr"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.620451 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.630555 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vthqg" event={"ID":"292354b3-46c7-4c76-b593-dda39380e797","Type":"ContainerDied","Data":"b1e4c135cb3a055c1a060be812602729257d425f63eb000cd0e6f8298af201f8"}
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.630609 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1e4c135cb3a055c1a060be812602729257d425f63eb000cd0e6f8298af201f8"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.630712 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vthqg"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.635284 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.635403 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.635556 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.637940 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kvqnq"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.648785 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hd6qr" event={"ID":"9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2","Type":"ContainerDied","Data":"ce2f9f60319266dd6167a561e1f6c620715d4f895ba56cf9529c9a5c95d0032a"}
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.648834 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce2f9f60319266dd6167a561e1f6c620715d4f895ba56cf9529c9a5c95d0032a"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.648903 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hd6qr"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.732607 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlwrv\" (UniqueName: \"kubernetes.io/projected/9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2-kube-api-access-vlwrv\") pod \"9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2\" (UID: \"9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2\") "
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733017 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx47j\" (UniqueName: \"kubernetes.io/projected/d70148a5-dabd-495e-99a5-bba978f790f3-kube-api-access-vx47j\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733055 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgldx\" (UniqueName: \"kubernetes.io/projected/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-kube-api-access-cgldx\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733083 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733111 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733146 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-scripts\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733221 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-logs\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733296 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733318 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733338 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733369 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733408 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733429 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733454 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.733473 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.755440 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2-kube-api-access-vlwrv" (OuterVolumeSpecName: "kube-api-access-vlwrv") pod "9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2" (UID: "9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2"). InnerVolumeSpecName "kube-api-access-vlwrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.880813 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.880877 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.880920 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.880961 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.881105 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx47j\" (UniqueName: \"kubernetes.io/projected/d70148a5-dabd-495e-99a5-bba978f790f3-kube-api-access-vx47j\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.881150 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgldx\" (UniqueName: \"kubernetes.io/projected/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-kube-api-access-cgldx\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.881193 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.881237 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.881301 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-scripts\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.881408 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-logs\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.881514 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.881554 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.881589 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.881650 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.882480 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.926260 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.926574 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.926633 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlwrv\" (UniqueName: \"kubernetes.io/projected/9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2-kube-api-access-vlwrv\") on node \"crc\" DevicePath \"\""
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.927426 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.932463 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.932872 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-logs\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.933557 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.933622 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.939146 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-scripts\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.959460 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.966901 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.968208 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.972880 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx47j\" (UniqueName: \"kubernetes.io/projected/d70148a5-dabd-495e-99a5-bba978f790f3-kube-api-access-vx47j\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:20 crc kubenswrapper[4831]: I1204 10:35:20.974557 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgldx\" (UniqueName: \"kubernetes.io/projected/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-kube-api-access-cgldx\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:21 crc kubenswrapper[4831]: I1204 10:35:21.023578 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:21 crc kubenswrapper[4831]: I1204 10:35:21.025535 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " pod="openstack/glance-default-external-api-0"
Dec 04 10:35:21 crc kubenswrapper[4831]: I1204 10:35:21.312235 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 04 10:35:21 crc kubenswrapper[4831]: I1204 10:35:21.315132 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 04 10:35:21 crc kubenswrapper[4831]: I1204 10:35:21.673259 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8165b5c-aa05-4fbb-9541-7b25c5a75dab","Type":"ContainerStarted","Data":"aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2"}
Dec 04 10:35:21 crc kubenswrapper[4831]: I1204 10:35:21.675912 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" event={"ID":"5173ee92-12de-4849-9659-882e5cfb1566","Type":"ContainerStarted","Data":"1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4"}
Dec 04 10:35:21 crc kubenswrapper[4831]: I1204 10:35:21.675959 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" event={"ID":"5173ee92-12de-4849-9659-882e5cfb1566","Type":"ContainerStarted","Data":"848c24145c819c42882eb04be06ec52cd5d1725211d62079fa0bd470c60608e5"}
Dec 04 10:35:21 crc kubenswrapper[4831]: I1204 10:35:21.695052 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" event={"ID":"04bc7d54-ace7-4cd2-b867-9ccc94190c9d","Type":"ContainerStarted","Data":"63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900"}
Dec 04 10:35:21 crc kubenswrapper[4831]: I1204 10:35:21.704852 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72ca3a62-9e01-4f14-8b45-6db4951fb6e6","Type":"ContainerStarted","Data":"2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa"}
Dec 04 10:35:21 crc kubenswrapper[4831]: I1204 10:35:21.971344 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:35:21 crc kubenswrapper[4831]: I1204 10:35:21.971615 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:35:22 crc kubenswrapper[4831]: I1204 10:35:22.028065 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 10:35:22 crc kubenswrapper[4831]: W1204 10:35:22.032908 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd70148a5_dabd_495e_99a5_bba978f790f3.slice/crio-ebe52925cf882efe56319b49580d293a3aa94e52173f5f12ce861913cae3ef2d WatchSource:0}: Error finding container ebe52925cf882efe56319b49580d293a3aa94e52173f5f12ce861913cae3ef2d: Status 404 returned error can't find the container with id ebe52925cf882efe56319b49580d293a3aa94e52173f5f12ce861913cae3ef2d
Dec 04 10:35:22 crc kubenswrapper[4831]: I1204 10:35:22.689800 4831 kubelet.go:2437]
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:22 crc kubenswrapper[4831]: I1204 10:35:22.740492 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fa9407a-9a04-4712-bac8-d89067b14304","Type":"ContainerStarted","Data":"13cd496430f493f3d5eb1c1f803b5815d3d73e339423997d2be1e31c268f6fb7"} Dec 04 10:35:22 crc kubenswrapper[4831]: I1204 10:35:22.742340 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d70148a5-dabd-495e-99a5-bba978f790f3","Type":"ContainerStarted","Data":"ebe52925cf882efe56319b49580d293a3aa94e52173f5f12ce861913cae3ef2d"} Dec 04 10:35:22 crc kubenswrapper[4831]: I1204 10:35:22.763506 4831 generic.go:334] "Generic (PLEG): container finished" podID="5173ee92-12de-4849-9659-882e5cfb1566" containerID="1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4" exitCode=0 Dec 04 10:35:22 crc kubenswrapper[4831]: I1204 10:35:22.763791 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" podUID="04bc7d54-ace7-4cd2-b867-9ccc94190c9d" containerName="dnsmasq-dns" containerID="cri-o://63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900" gracePeriod=10 Dec 04 10:35:22 crc kubenswrapper[4831]: I1204 10:35:22.764369 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" event={"ID":"5173ee92-12de-4849-9659-882e5cfb1566","Type":"ContainerDied","Data":"1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4"} Dec 04 10:35:22 crc kubenswrapper[4831]: I1204 10:35:22.764405 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:22 crc kubenswrapper[4831]: I1204 10:35:22.770503 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:22 crc 
kubenswrapper[4831]: I1204 10:35:22.847782 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" podStartSLOduration=6.8477631599999995 podStartE2EDuration="6.84776316s" podCreationTimestamp="2025-12-04 10:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:22.829427133 +0000 UTC m=+1219.778602457" watchObservedRunningTime="2025-12-04 10:35:22.84776316 +0000 UTC m=+1219.796938474" Dec 04 10:35:22 crc kubenswrapper[4831]: I1204 10:35:22.877150 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.463004 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.607010 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-config\") pod \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.607299 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-sb\") pod \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.607416 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwp95\" (UniqueName: \"kubernetes.io/projected/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-kube-api-access-qwp95\") pod \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " Dec 04 10:35:23 crc 
kubenswrapper[4831]: I1204 10:35:23.607493 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-svc\") pod \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.607974 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-swift-storage-0\") pod \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.608084 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-nb\") pod \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\" (UID: \"04bc7d54-ace7-4cd2-b867-9ccc94190c9d\") " Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.637777 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-kube-api-access-qwp95" (OuterVolumeSpecName: "kube-api-access-qwp95") pod "04bc7d54-ace7-4cd2-b867-9ccc94190c9d" (UID: "04bc7d54-ace7-4cd2-b867-9ccc94190c9d"). InnerVolumeSpecName "kube-api-access-qwp95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.713256 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwp95\" (UniqueName: \"kubernetes.io/projected/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-kube-api-access-qwp95\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.723867 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04bc7d54-ace7-4cd2-b867-9ccc94190c9d" (UID: "04bc7d54-ace7-4cd2-b867-9ccc94190c9d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.749205 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04bc7d54-ace7-4cd2-b867-9ccc94190c9d" (UID: "04bc7d54-ace7-4cd2-b867-9ccc94190c9d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.763171 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04bc7d54-ace7-4cd2-b867-9ccc94190c9d" (UID: "04bc7d54-ace7-4cd2-b867-9ccc94190c9d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.764073 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04bc7d54-ace7-4cd2-b867-9ccc94190c9d" (UID: "04bc7d54-ace7-4cd2-b867-9ccc94190c9d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.776184 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-config" (OuterVolumeSpecName: "config") pod "04bc7d54-ace7-4cd2-b867-9ccc94190c9d" (UID: "04bc7d54-ace7-4cd2-b867-9ccc94190c9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.785560 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8165b5c-aa05-4fbb-9541-7b25c5a75dab","Type":"ContainerStarted","Data":"ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f"} Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.785722 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a8165b5c-aa05-4fbb-9541-7b25c5a75dab" containerName="cinder-api-log" containerID="cri-o://aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2" gracePeriod=30 Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.785957 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.786185 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a8165b5c-aa05-4fbb-9541-7b25c5a75dab" containerName="cinder-api" 
containerID="cri-o://ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f" gracePeriod=30 Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.818927 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d70148a5-dabd-495e-99a5-bba978f790f3","Type":"ContainerStarted","Data":"1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398"} Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.819145 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.819160 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.819171 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.819179 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.819188 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04bc7d54-ace7-4cd2-b867-9ccc94190c9d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.825972 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.825948924 podStartE2EDuration="7.825948924s" 
podCreationTimestamp="2025-12-04 10:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:23.813099732 +0000 UTC m=+1220.762275046" watchObservedRunningTime="2025-12-04 10:35:23.825948924 +0000 UTC m=+1220.775124238" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.831755 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" event={"ID":"5173ee92-12de-4849-9659-882e5cfb1566","Type":"ContainerStarted","Data":"d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce"} Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.832983 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.847553 4831 generic.go:334] "Generic (PLEG): container finished" podID="04bc7d54-ace7-4cd2-b867-9ccc94190c9d" containerID="63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900" exitCode=0 Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.847606 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" event={"ID":"04bc7d54-ace7-4cd2-b867-9ccc94190c9d","Type":"ContainerDied","Data":"63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900"} Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.847625 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" event={"ID":"04bc7d54-ace7-4cd2-b867-9ccc94190c9d","Type":"ContainerDied","Data":"346acaf7bf950e2341827380b42dc6eb6fe26e8250028da61a75b19ec7319a9e"} Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.847640 4831 scope.go:117] "RemoveContainer" containerID="63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.847845 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-678bb56fc5-tf8dc" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.859284 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" podStartSLOduration=5.859262208 podStartE2EDuration="5.859262208s" podCreationTimestamp="2025-12-04 10:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:23.853013492 +0000 UTC m=+1220.802188816" watchObservedRunningTime="2025-12-04 10:35:23.859262208 +0000 UTC m=+1220.808437532" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.872916 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72ca3a62-9e01-4f14-8b45-6db4951fb6e6","Type":"ContainerStarted","Data":"17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f"} Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.879183 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf","Type":"ContainerStarted","Data":"66af1cc4f15e4bb51a37b78c2d077fb4efe775bc957a763f134584855471c23c"} Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.899197 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.590854377 podStartE2EDuration="7.899171628s" podCreationTimestamp="2025-12-04 10:35:16 +0000 UTC" firstStartedPulling="2025-12-04 10:35:17.909648057 +0000 UTC m=+1214.858823361" lastFinishedPulling="2025-12-04 10:35:18.217965298 +0000 UTC m=+1215.167140612" observedRunningTime="2025-12-04 10:35:23.892701376 +0000 UTC m=+1220.841876700" watchObservedRunningTime="2025-12-04 10:35:23.899171628 +0000 UTC m=+1220.848346942" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.940068 4831 scope.go:117] "RemoveContainer" 
containerID="193ceedcf09cb420ef02c38ea265ee3fc936a69b3f0e2e2dddeeccd1fc108251" Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.942183 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-678bb56fc5-tf8dc"] Dec 04 10:35:23 crc kubenswrapper[4831]: I1204 10:35:23.952839 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-678bb56fc5-tf8dc"] Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.074842 4831 scope.go:117] "RemoveContainer" containerID="63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900" Dec 04 10:35:24 crc kubenswrapper[4831]: E1204 10:35:24.075599 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900\": container with ID starting with 63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900 not found: ID does not exist" containerID="63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.075695 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900"} err="failed to get container status \"63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900\": rpc error: code = NotFound desc = could not find container \"63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900\": container with ID starting with 63cafffd975d640e573c19f84e392f7a16cd95c086578b0c9aadc8509f183900 not found: ID does not exist" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.075720 4831 scope.go:117] "RemoveContainer" containerID="193ceedcf09cb420ef02c38ea265ee3fc936a69b3f0e2e2dddeeccd1fc108251" Dec 04 10:35:24 crc kubenswrapper[4831]: E1204 10:35:24.076132 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"193ceedcf09cb420ef02c38ea265ee3fc936a69b3f0e2e2dddeeccd1fc108251\": container with ID starting with 193ceedcf09cb420ef02c38ea265ee3fc936a69b3f0e2e2dddeeccd1fc108251 not found: ID does not exist" containerID="193ceedcf09cb420ef02c38ea265ee3fc936a69b3f0e2e2dddeeccd1fc108251" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.076158 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193ceedcf09cb420ef02c38ea265ee3fc936a69b3f0e2e2dddeeccd1fc108251"} err="failed to get container status \"193ceedcf09cb420ef02c38ea265ee3fc936a69b3f0e2e2dddeeccd1fc108251\": rpc error: code = NotFound desc = could not find container \"193ceedcf09cb420ef02c38ea265ee3fc936a69b3f0e2e2dddeeccd1fc108251\": container with ID starting with 193ceedcf09cb420ef02c38ea265ee3fc936a69b3f0e2e2dddeeccd1fc108251 not found: ID does not exist" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.345172 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.432399 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-logs\") pod \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.432536 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data-custom\") pod \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.432598 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-combined-ca-bundle\") pod \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.432758 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data\") pod \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.432802 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-scripts\") pod \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.432821 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpf5f\" (UniqueName: 
\"kubernetes.io/projected/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-kube-api-access-kpf5f\") pod \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.432841 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-etc-machine-id\") pod \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\" (UID: \"a8165b5c-aa05-4fbb-9541-7b25c5a75dab\") " Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.433231 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a8165b5c-aa05-4fbb-9541-7b25c5a75dab" (UID: "a8165b5c-aa05-4fbb-9541-7b25c5a75dab"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.437181 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-logs" (OuterVolumeSpecName: "logs") pod "a8165b5c-aa05-4fbb-9541-7b25c5a75dab" (UID: "a8165b5c-aa05-4fbb-9541-7b25c5a75dab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.443474 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a8165b5c-aa05-4fbb-9541-7b25c5a75dab" (UID: "a8165b5c-aa05-4fbb-9541-7b25c5a75dab"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.444983 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-kube-api-access-kpf5f" (OuterVolumeSpecName: "kube-api-access-kpf5f") pod "a8165b5c-aa05-4fbb-9541-7b25c5a75dab" (UID: "a8165b5c-aa05-4fbb-9541-7b25c5a75dab"). InnerVolumeSpecName "kube-api-access-kpf5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.456322 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-scripts" (OuterVolumeSpecName: "scripts") pod "a8165b5c-aa05-4fbb-9541-7b25c5a75dab" (UID: "a8165b5c-aa05-4fbb-9541-7b25c5a75dab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.486769 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8165b5c-aa05-4fbb-9541-7b25c5a75dab" (UID: "a8165b5c-aa05-4fbb-9541-7b25c5a75dab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.527855 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data" (OuterVolumeSpecName: "config-data") pod "a8165b5c-aa05-4fbb-9541-7b25c5a75dab" (UID: "a8165b5c-aa05-4fbb-9541-7b25c5a75dab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.534917 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.534962 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.534977 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpf5f\" (UniqueName: \"kubernetes.io/projected/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-kube-api-access-kpf5f\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.534992 4831 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.535007 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.535017 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.535028 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8165b5c-aa05-4fbb-9541-7b25c5a75dab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.892108 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d70148a5-dabd-495e-99a5-bba978f790f3","Type":"ContainerStarted","Data":"032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490"} Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.892211 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d70148a5-dabd-495e-99a5-bba978f790f3" containerName="glance-log" containerID="cri-o://1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398" gracePeriod=30 Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.892264 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d70148a5-dabd-495e-99a5-bba978f790f3" containerName="glance-httpd" containerID="cri-o://032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490" gracePeriod=30 Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.899371 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fa9407a-9a04-4712-bac8-d89067b14304","Type":"ContainerStarted","Data":"8cb7483f62005d83064a87e635c4881257cb8cba0cb7f365e44c81c19700294a"} Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.899472 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.901836 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf","Type":"ContainerStarted","Data":"23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0"} Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.903789 4831 generic.go:334] "Generic (PLEG): container finished" podID="a8165b5c-aa05-4fbb-9541-7b25c5a75dab" containerID="ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f" exitCode=0 Dec 04 10:35:24 
crc kubenswrapper[4831]: I1204 10:35:24.903821 4831 generic.go:334] "Generic (PLEG): container finished" podID="a8165b5c-aa05-4fbb-9541-7b25c5a75dab" containerID="aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2" exitCode=143 Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.904620 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.905839 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8165b5c-aa05-4fbb-9541-7b25c5a75dab","Type":"ContainerDied","Data":"ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f"} Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.906040 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8165b5c-aa05-4fbb-9541-7b25c5a75dab","Type":"ContainerDied","Data":"aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2"} Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.906053 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8165b5c-aa05-4fbb-9541-7b25c5a75dab","Type":"ContainerDied","Data":"12de460683e4c03bc34512e482b76b5d7298809b4fd628c9cf984de59df767c1"} Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.906070 4831 scope.go:117] "RemoveContainer" containerID="ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.914350 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.914334093 podStartE2EDuration="6.914334093s" podCreationTimestamp="2025-12-04 10:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:24.913607734 +0000 UTC m=+1221.862783048" 
watchObservedRunningTime="2025-12-04 10:35:24.914334093 +0000 UTC m=+1221.863509407" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.954618 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6326254860000002 podStartE2EDuration="8.954601882s" podCreationTimestamp="2025-12-04 10:35:16 +0000 UTC" firstStartedPulling="2025-12-04 10:35:17.334269672 +0000 UTC m=+1214.283444986" lastFinishedPulling="2025-12-04 10:35:23.656246068 +0000 UTC m=+1220.605421382" observedRunningTime="2025-12-04 10:35:24.954047248 +0000 UTC m=+1221.903222562" watchObservedRunningTime="2025-12-04 10:35:24.954601882 +0000 UTC m=+1221.903777196" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.971004 4831 scope.go:117] "RemoveContainer" containerID="aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2" Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.983875 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:35:24 crc kubenswrapper[4831]: I1204 10:35:24.996216 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.007958 4831 scope.go:117] "RemoveContainer" containerID="ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f" Dec 04 10:35:25 crc kubenswrapper[4831]: E1204 10:35:25.008951 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f\": container with ID starting with ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f not found: ID does not exist" containerID="ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.008980 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f"} err="failed to get container status \"ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f\": rpc error: code = NotFound desc = could not find container \"ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f\": container with ID starting with ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f not found: ID does not exist" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.009001 4831 scope.go:117] "RemoveContainer" containerID="aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2" Dec 04 10:35:25 crc kubenswrapper[4831]: E1204 10:35:25.009198 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2\": container with ID starting with aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2 not found: ID does not exist" containerID="aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.009217 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2"} err="failed to get container status \"aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2\": rpc error: code = NotFound desc = could not find container \"aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2\": container with ID starting with aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2 not found: ID does not exist" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.009229 4831 scope.go:117] "RemoveContainer" containerID="ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.009969 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-api-0"] Dec 04 10:35:25 crc kubenswrapper[4831]: E1204 10:35:25.010341 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8165b5c-aa05-4fbb-9541-7b25c5a75dab" containerName="cinder-api" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.010358 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8165b5c-aa05-4fbb-9541-7b25c5a75dab" containerName="cinder-api" Dec 04 10:35:25 crc kubenswrapper[4831]: E1204 10:35:25.010380 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04bc7d54-ace7-4cd2-b867-9ccc94190c9d" containerName="init" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.010387 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="04bc7d54-ace7-4cd2-b867-9ccc94190c9d" containerName="init" Dec 04 10:35:25 crc kubenswrapper[4831]: E1204 10:35:25.010402 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2" containerName="mariadb-database-create" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.010408 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2" containerName="mariadb-database-create" Dec 04 10:35:25 crc kubenswrapper[4831]: E1204 10:35:25.010429 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04bc7d54-ace7-4cd2-b867-9ccc94190c9d" containerName="dnsmasq-dns" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.010435 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="04bc7d54-ace7-4cd2-b867-9ccc94190c9d" containerName="dnsmasq-dns" Dec 04 10:35:25 crc kubenswrapper[4831]: E1204 10:35:25.010450 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8165b5c-aa05-4fbb-9541-7b25c5a75dab" containerName="cinder-api-log" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.010457 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8165b5c-aa05-4fbb-9541-7b25c5a75dab" containerName="cinder-api-log" Dec 04 
10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.010611 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2" containerName="mariadb-database-create" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.010628 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="04bc7d54-ace7-4cd2-b867-9ccc94190c9d" containerName="dnsmasq-dns" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.010637 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8165b5c-aa05-4fbb-9541-7b25c5a75dab" containerName="cinder-api-log" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.010646 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8165b5c-aa05-4fbb-9541-7b25c5a75dab" containerName="cinder-api" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.010953 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f"} err="failed to get container status \"ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f\": rpc error: code = NotFound desc = could not find container \"ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f\": container with ID starting with ed01a400f31f6f1b88ce1bda9c588e7abe641eb3da9178fc8120f9249b674f8f not found: ID does not exist" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.010971 4831 scope.go:117] "RemoveContainer" containerID="aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.011153 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2"} err="failed to get container status \"aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2\": rpc error: code = NotFound desc = could not find container 
\"aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2\": container with ID starting with aa14084d820d4e5ae0121d90b696ff38005cdf9cde47f1393531d11d7bed4ee2 not found: ID does not exist" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.011953 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.025072 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.025286 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.025389 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.034876 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.146908 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-config-data-custom\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.146993 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.147020 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9djt4\" (UniqueName: 
\"kubernetes.io/projected/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-kube-api-access-9djt4\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.147053 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.147085 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-logs\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.147160 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-scripts\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.147191 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-config-data\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.147238 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.147285 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.249264 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.249312 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9djt4\" (UniqueName: \"kubernetes.io/projected/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-kube-api-access-9djt4\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.249337 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.249361 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-logs\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.249382 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-scripts\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.249404 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-config-data\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.249441 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.249480 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.249549 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-config-data-custom\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.252172 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-logs\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " 
pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.252243 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.261963 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-config-data\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.264090 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.263562 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-config-data-custom\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.267178 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-scripts\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.272261 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.275350 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.276574 4831 scope.go:117] "RemoveContainer" containerID="bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.300245 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9djt4\" (UniqueName: \"kubernetes.io/projected/f61bdc02-a856-448f-9cdf-f3c43efc4bfc-kube-api-access-9djt4\") pod \"cinder-api-0\" (UID: \"f61bdc02-a856-448f-9cdf-f3c43efc4bfc\") " pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.301591 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04bc7d54-ace7-4cd2-b867-9ccc94190c9d" path="/var/lib/kubelet/pods/04bc7d54-ace7-4cd2-b867-9ccc94190c9d/volumes" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.302369 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8165b5c-aa05-4fbb-9541-7b25c5a75dab" path="/var/lib/kubelet/pods/a8165b5c-aa05-4fbb-9541-7b25c5a75dab/volumes" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.378995 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.392353 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.751290 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.865347 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-logs\") pod \"d70148a5-dabd-495e-99a5-bba978f790f3\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.865476 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"d70148a5-dabd-495e-99a5-bba978f790f3\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.865532 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx47j\" (UniqueName: \"kubernetes.io/projected/d70148a5-dabd-495e-99a5-bba978f790f3-kube-api-access-vx47j\") pod \"d70148a5-dabd-495e-99a5-bba978f790f3\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.865578 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-combined-ca-bundle\") pod \"d70148a5-dabd-495e-99a5-bba978f790f3\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.865639 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-scripts\") pod \"d70148a5-dabd-495e-99a5-bba978f790f3\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " Dec 04 10:35:25 crc 
kubenswrapper[4831]: I1204 10:35:25.865749 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-httpd-run\") pod \"d70148a5-dabd-495e-99a5-bba978f790f3\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.865769 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-config-data\") pod \"d70148a5-dabd-495e-99a5-bba978f790f3\" (UID: \"d70148a5-dabd-495e-99a5-bba978f790f3\") " Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.865956 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-logs" (OuterVolumeSpecName: "logs") pod "d70148a5-dabd-495e-99a5-bba978f790f3" (UID: "d70148a5-dabd-495e-99a5-bba978f790f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.866230 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.866441 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d70148a5-dabd-495e-99a5-bba978f790f3" (UID: "d70148a5-dabd-495e-99a5-bba978f790f3"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.876913 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d70148a5-dabd-495e-99a5-bba978f790f3-kube-api-access-vx47j" (OuterVolumeSpecName: "kube-api-access-vx47j") pod "d70148a5-dabd-495e-99a5-bba978f790f3" (UID: "d70148a5-dabd-495e-99a5-bba978f790f3"). InnerVolumeSpecName "kube-api-access-vx47j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.882227 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.891966 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "d70148a5-dabd-495e-99a5-bba978f790f3" (UID: "d70148a5-dabd-495e-99a5-bba978f790f3"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.896176 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-scripts" (OuterVolumeSpecName: "scripts") pod "d70148a5-dabd-495e-99a5-bba978f790f3" (UID: "d70148a5-dabd-495e-99a5-bba978f790f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.911161 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d70148a5-dabd-495e-99a5-bba978f790f3" (UID: "d70148a5-dabd-495e-99a5-bba978f790f3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.915418 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:35:25 crc kubenswrapper[4831]: W1204 10:35:25.918752 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf61bdc02_a856_448f_9cdf_f3c43efc4bfc.slice/crio-88b2581c03bd90b767ec6d2d766067ae33d8a16ed09866ca5143a2f108f65d8f WatchSource:0}: Error finding container 88b2581c03bd90b767ec6d2d766067ae33d8a16ed09866ca5143a2f108f65d8f: Status 404 returned error can't find the container with id 88b2581c03bd90b767ec6d2d766067ae33d8a16ed09866ca5143a2f108f65d8f Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.934725 4831 generic.go:334] "Generic (PLEG): container finished" podID="d70148a5-dabd-495e-99a5-bba978f790f3" containerID="032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490" exitCode=0 Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.937178 4831 generic.go:334] "Generic (PLEG): container finished" podID="d70148a5-dabd-495e-99a5-bba978f790f3" containerID="1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398" exitCode=143 Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.937313 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d70148a5-dabd-495e-99a5-bba978f790f3","Type":"ContainerDied","Data":"032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490"} Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.937407 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d70148a5-dabd-495e-99a5-bba978f790f3","Type":"ContainerDied","Data":"1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398"} Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.937492 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"d70148a5-dabd-495e-99a5-bba978f790f3","Type":"ContainerDied","Data":"ebe52925cf882efe56319b49580d293a3aa94e52173f5f12ce861913cae3ef2d"} Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.937557 4831 scope.go:117] "RemoveContainer" containerID="032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.937743 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.968903 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.970033 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx47j\" (UniqueName: \"kubernetes.io/projected/d70148a5-dabd-495e-99a5-bba978f790f3-kube-api-access-vx47j\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.970122 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.970217 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.970297 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d70148a5-dabd-495e-99a5-bba978f790f3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.995072 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf","Type":"ContainerStarted","Data":"1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0"} Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.995258 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" containerName="glance-log" containerID="cri-o://23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0" gracePeriod=30 Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.995507 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" containerName="glance-httpd" containerID="cri-o://1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0" gracePeriod=30 Dec 04 10:35:25 crc kubenswrapper[4831]: I1204 10:35:25.999861 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.004621 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6040f79c-a151-4681-a758-d2741bff68b6","Type":"ContainerStarted","Data":"362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7"} Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.006821 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-config-data" (OuterVolumeSpecName: "config-data") pod "d70148a5-dabd-495e-99a5-bba978f790f3" (UID: "d70148a5-dabd-495e-99a5-bba978f790f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.055836 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.055817213 podStartE2EDuration="8.055817213s" podCreationTimestamp="2025-12-04 10:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:26.01995285 +0000 UTC m=+1222.969128184" watchObservedRunningTime="2025-12-04 10:35:26.055817213 +0000 UTC m=+1223.004992527" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.073148 4831 scope.go:117] "RemoveContainer" containerID="1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.076944 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d70148a5-dabd-495e-99a5-bba978f790f3-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.077165 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.122328 4831 scope.go:117] "RemoveContainer" containerID="032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490" Dec 04 10:35:26 crc kubenswrapper[4831]: E1204 10:35:26.126151 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490\": container with ID starting with 032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490 not found: ID does not exist" containerID="032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 
10:35:26.126378 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490"} err="failed to get container status \"032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490\": rpc error: code = NotFound desc = could not find container \"032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490\": container with ID starting with 032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490 not found: ID does not exist" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.126405 4831 scope.go:117] "RemoveContainer" containerID="1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398" Dec 04 10:35:26 crc kubenswrapper[4831]: E1204 10:35:26.135814 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398\": container with ID starting with 1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398 not found: ID does not exist" containerID="1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.135866 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398"} err="failed to get container status \"1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398\": rpc error: code = NotFound desc = could not find container \"1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398\": container with ID starting with 1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398 not found: ID does not exist" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.135896 4831 scope.go:117] "RemoveContainer" containerID="032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490" Dec 04 10:35:26 crc 
kubenswrapper[4831]: I1204 10:35:26.136424 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490"} err="failed to get container status \"032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490\": rpc error: code = NotFound desc = could not find container \"032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490\": container with ID starting with 032fb5f8c7f9424652d8d59594f8486199c0ad896ac1ea43f164175952386490 not found: ID does not exist" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.136446 4831 scope.go:117] "RemoveContainer" containerID="1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.139463 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398"} err="failed to get container status \"1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398\": rpc error: code = NotFound desc = could not find container \"1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398\": container with ID starting with 1e580ae25b273d0954a447ab21a27041865e0589a79d60218f6061181a286398 not found: ID does not exist" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.280292 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.285450 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.357643 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:26 crc kubenswrapper[4831]: E1204 10:35:26.360516 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d70148a5-dabd-495e-99a5-bba978f790f3" containerName="glance-log" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.360543 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70148a5-dabd-495e-99a5-bba978f790f3" containerName="glance-log" Dec 04 10:35:26 crc kubenswrapper[4831]: E1204 10:35:26.360569 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70148a5-dabd-495e-99a5-bba978f790f3" containerName="glance-httpd" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.360578 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70148a5-dabd-495e-99a5-bba978f790f3" containerName="glance-httpd" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.364834 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d70148a5-dabd-495e-99a5-bba978f790f3" containerName="glance-log" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.364858 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d70148a5-dabd-495e-99a5-bba978f790f3" containerName="glance-httpd" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.366412 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.370100 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.370442 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.387189 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.494637 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.494995 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5x8n\" (UniqueName: \"kubernetes.io/projected/9b4fe343-759d-470c-9e6e-1daba0a5d58d-kube-api-access-l5x8n\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.495142 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-logs\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.495282 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.495396 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.496134 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.496219 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.496278 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.597864 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.597926 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.597963 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.597982 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5x8n\" (UniqueName: \"kubernetes.io/projected/9b4fe343-759d-470c-9e6e-1daba0a5d58d-kube-api-access-l5x8n\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.598062 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-logs\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.598119 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.598183 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.598216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.599395 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-logs\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.599715 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.599809 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: 
\"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.604596 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.605132 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.609356 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.610653 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.634624 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5x8n\" (UniqueName: \"kubernetes.io/projected/9b4fe343-759d-470c-9e6e-1daba0a5d58d-kube-api-access-l5x8n\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " 
pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.657475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.704232 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:35:26 crc kubenswrapper[4831]: I1204 10:35:26.895038 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.004892 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-httpd-run\") pod \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.004983 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.005081 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-config-data\") pod \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.005106 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-combined-ca-bundle\") pod \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.005155 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-logs\") pod \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.005205 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgldx\" (UniqueName: \"kubernetes.io/projected/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-kube-api-access-cgldx\") pod \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.005252 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-scripts\") pod \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\" (UID: \"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf\") " Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.005354 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" (UID: "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.005621 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.005921 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-logs" (OuterVolumeSpecName: "logs") pod "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" (UID: "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.010989 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-kube-api-access-cgldx" (OuterVolumeSpecName: "kube-api-access-cgldx") pod "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" (UID: "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf"). InnerVolumeSpecName "kube-api-access-cgldx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.015742 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" (UID: "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.022523 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-scripts" (OuterVolumeSpecName: "scripts") pod "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" (UID: "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.028065 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f61bdc02-a856-448f-9cdf-f3c43efc4bfc","Type":"ContainerStarted","Data":"4cb31c7549ce4c78efc28e8f18b8cb6c6369d4e0a1cbd6434f6b8d436f7c8444"} Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.028122 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f61bdc02-a856-448f-9cdf-f3c43efc4bfc","Type":"ContainerStarted","Data":"88b2581c03bd90b767ec6d2d766067ae33d8a16ed09866ca5143a2f108f65d8f"} Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.031036 4831 generic.go:334] "Generic (PLEG): container finished" podID="2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" containerID="1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0" exitCode=0 Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.031077 4831 generic.go:334] "Generic (PLEG): container finished" podID="2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" containerID="23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0" exitCode=143 Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.031118 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.031133 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf","Type":"ContainerDied","Data":"1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0"} Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.031170 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf","Type":"ContainerDied","Data":"23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0"} Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.031184 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf","Type":"ContainerDied","Data":"66af1cc4f15e4bb51a37b78c2d077fb4efe775bc957a763f134584855471c23c"} Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.031203 4831 scope.go:117] "RemoveContainer" containerID="1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.034463 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="ceilometer-central-agent" containerID="cri-o://55d6de675878e9339fa5a8febe11f79629911d2d33419b23452f26a864f1668b" gracePeriod=30 Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.034533 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="sg-core" containerID="cri-o://13cd496430f493f3d5eb1c1f803b5815d3d73e339423997d2be1e31c268f6fb7" gracePeriod=30 Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.034564 4831 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="ceilometer-notification-agent" containerID="cri-o://0513a579e5eccabc0b8e66fb008748bbcdc24aad70905a810665e9e5784cd89a" gracePeriod=30 Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.034625 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="proxy-httpd" containerID="cri-o://8cb7483f62005d83064a87e635c4881257cb8cba0cb7f365e44c81c19700294a" gracePeriod=30 Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.076805 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" (UID: "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.101120 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-config-data" (OuterVolumeSpecName: "config-data") pod "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" (UID: "2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.103314 4831 scope.go:117] "RemoveContainer" containerID="23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.110860 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.110941 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgldx\" (UniqueName: \"kubernetes.io/projected/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-kube-api-access-cgldx\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.110960 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.111095 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.111135 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.111198 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.116740 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 10:35:27 crc 
kubenswrapper[4831]: I1204 10:35:27.151109 4831 scope.go:117] "RemoveContainer" containerID="1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0" Dec 04 10:35:27 crc kubenswrapper[4831]: E1204 10:35:27.153673 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0\": container with ID starting with 1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0 not found: ID does not exist" containerID="1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.153715 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0"} err="failed to get container status \"1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0\": rpc error: code = NotFound desc = could not find container \"1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0\": container with ID starting with 1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0 not found: ID does not exist" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.153742 4831 scope.go:117] "RemoveContainer" containerID="23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0" Dec 04 10:35:27 crc kubenswrapper[4831]: E1204 10:35:27.154050 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0\": container with ID starting with 23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0 not found: ID does not exist" containerID="23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.154067 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0"} err="failed to get container status \"23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0\": rpc error: code = NotFound desc = could not find container \"23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0\": container with ID starting with 23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0 not found: ID does not exist" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.154079 4831 scope.go:117] "RemoveContainer" containerID="1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.154810 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0"} err="failed to get container status \"1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0\": rpc error: code = NotFound desc = could not find container \"1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0\": container with ID starting with 1d51a23531a32b9f14baaa5d7f6fcf2d59c3260099acddcc691282b097af0bd0 not found: ID does not exist" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.154828 4831 scope.go:117] "RemoveContainer" containerID="23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.154976 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0"} err="failed to get container status \"23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0\": rpc error: code = NotFound desc = could not find container \"23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0\": container with ID starting with 23206df2bac2856f43efed995d60cd89740864d655544cb9484ce2b3ca0792d0 not found: ID does not 
exist" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.162800 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.212841 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:27 crc kubenswrapper[4831]: W1204 10:35:27.289478 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b4fe343_759d_470c_9e6e_1daba0a5d58d.slice/crio-8898853e5a3437b117d1d5605846936787d0a50434e0d5fece9f2b0e6cdf66a8 WatchSource:0}: Error finding container 8898853e5a3437b117d1d5605846936787d0a50434e0d5fece9f2b0e6cdf66a8: Status 404 returned error can't find the container with id 8898853e5a3437b117d1d5605846936787d0a50434e0d5fece9f2b0e6cdf66a8 Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.289823 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d70148a5-dabd-495e-99a5-bba978f790f3" path="/var/lib/kubelet/pods/d70148a5-dabd-495e-99a5-bba978f790f3/volumes" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.290868 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.382746 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.393086 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.395701 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 
10:35:27.411719 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:27 crc kubenswrapper[4831]: E1204 10:35:27.412192 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" containerName="glance-log" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.412208 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" containerName="glance-log" Dec 04 10:35:27 crc kubenswrapper[4831]: E1204 10:35:27.412234 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" containerName="glance-httpd" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.412242 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" containerName="glance-httpd" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.412449 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" containerName="glance-log" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.412463 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" containerName="glance-httpd" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.414234 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.417971 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.421710 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.430977 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.521846 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.521939 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.521994 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.522023 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.522064 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvgx7\" (UniqueName: \"kubernetes.io/projected/63dffe41-8eb3-4696-bfdc-de4fe1735e19-kube-api-access-cvgx7\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.522127 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.522447 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-logs\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.522495 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.627838 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.628207 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.628248 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvgx7\" (UniqueName: \"kubernetes.io/projected/63dffe41-8eb3-4696-bfdc-de4fe1735e19-kube-api-access-cvgx7\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.628300 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.628334 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-logs\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.628368 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.628398 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.628434 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.630134 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.631967 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.636051 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.664992 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.671967 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.672563 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.677390 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.678047 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvgx7\" (UniqueName: \"kubernetes.io/projected/63dffe41-8eb3-4696-bfdc-de4fe1735e19-kube-api-access-cvgx7\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.715989 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:27 crc kubenswrapper[4831]: I1204 10:35:27.790388 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.054697 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b4fe343-759d-470c-9e6e-1daba0a5d58d","Type":"ContainerStarted","Data":"8898853e5a3437b117d1d5605846936787d0a50434e0d5fece9f2b0e6cdf66a8"} Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.062045 4831 generic.go:334] "Generic (PLEG): container finished" podID="7fa9407a-9a04-4712-bac8-d89067b14304" containerID="8cb7483f62005d83064a87e635c4881257cb8cba0cb7f365e44c81c19700294a" exitCode=0 Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.062080 4831 generic.go:334] "Generic (PLEG): container finished" podID="7fa9407a-9a04-4712-bac8-d89067b14304" containerID="13cd496430f493f3d5eb1c1f803b5815d3d73e339423997d2be1e31c268f6fb7" exitCode=2 Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.062089 4831 generic.go:334] "Generic (PLEG): container finished" podID="7fa9407a-9a04-4712-bac8-d89067b14304" containerID="0513a579e5eccabc0b8e66fb008748bbcdc24aad70905a810665e9e5784cd89a" exitCode=0 Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.062099 4831 generic.go:334] "Generic (PLEG): container finished" podID="7fa9407a-9a04-4712-bac8-d89067b14304" containerID="55d6de675878e9339fa5a8febe11f79629911d2d33419b23452f26a864f1668b" exitCode=0 Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.062140 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fa9407a-9a04-4712-bac8-d89067b14304","Type":"ContainerDied","Data":"8cb7483f62005d83064a87e635c4881257cb8cba0cb7f365e44c81c19700294a"} Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.062206 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fa9407a-9a04-4712-bac8-d89067b14304","Type":"ContainerDied","Data":"13cd496430f493f3d5eb1c1f803b5815d3d73e339423997d2be1e31c268f6fb7"} Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.062220 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fa9407a-9a04-4712-bac8-d89067b14304","Type":"ContainerDied","Data":"0513a579e5eccabc0b8e66fb008748bbcdc24aad70905a810665e9e5784cd89a"} Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.062234 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fa9407a-9a04-4712-bac8-d89067b14304","Type":"ContainerDied","Data":"55d6de675878e9339fa5a8febe11f79629911d2d33419b23452f26a864f1668b"} Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.116216 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.404545 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-66c4c75c85-69mpg" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.491845 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79f849bb84-btxkg"] Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.492403 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79f849bb84-btxkg" podUID="a02d0bff-55e1-4de5-95e4-98d65018cbf0" containerName="neutron-api" containerID="cri-o://c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0" gracePeriod=30 Dec 04 10:35:28 crc 
kubenswrapper[4831]: I1204 10:35:28.492704 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79f849bb84-btxkg" podUID="a02d0bff-55e1-4de5-95e4-98d65018cbf0" containerName="neutron-httpd" containerID="cri-o://a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3" gracePeriod=30 Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.501581 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.502915 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.560760 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-scripts\") pod \"7fa9407a-9a04-4712-bac8-d89067b14304\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.560817 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-combined-ca-bundle\") pod \"7fa9407a-9a04-4712-bac8-d89067b14304\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.560883 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-log-httpd\") pod \"7fa9407a-9a04-4712-bac8-d89067b14304\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.560922 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-sg-core-conf-yaml\") pod 
\"7fa9407a-9a04-4712-bac8-d89067b14304\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.561000 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skj6x\" (UniqueName: \"kubernetes.io/projected/7fa9407a-9a04-4712-bac8-d89067b14304-kube-api-access-skj6x\") pod \"7fa9407a-9a04-4712-bac8-d89067b14304\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.561037 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-run-httpd\") pod \"7fa9407a-9a04-4712-bac8-d89067b14304\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.561088 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-config-data\") pod \"7fa9407a-9a04-4712-bac8-d89067b14304\" (UID: \"7fa9407a-9a04-4712-bac8-d89067b14304\") " Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.563413 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7fa9407a-9a04-4712-bac8-d89067b14304" (UID: "7fa9407a-9a04-4712-bac8-d89067b14304"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.563722 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7fa9407a-9a04-4712-bac8-d89067b14304" (UID: "7fa9407a-9a04-4712-bac8-d89067b14304"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.572956 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-scripts" (OuterVolumeSpecName: "scripts") pod "7fa9407a-9a04-4712-bac8-d89067b14304" (UID: "7fa9407a-9a04-4712-bac8-d89067b14304"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.577342 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa9407a-9a04-4712-bac8-d89067b14304-kube-api-access-skj6x" (OuterVolumeSpecName: "kube-api-access-skj6x") pod "7fa9407a-9a04-4712-bac8-d89067b14304" (UID: "7fa9407a-9a04-4712-bac8-d89067b14304"). InnerVolumeSpecName "kube-api-access-skj6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.598129 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7fa9407a-9a04-4712-bac8-d89067b14304" (UID: "7fa9407a-9a04-4712-bac8-d89067b14304"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.663423 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.663601 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.663682 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.663775 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skj6x\" (UniqueName: \"kubernetes.io/projected/7fa9407a-9a04-4712-bac8-d89067b14304-kube-api-access-skj6x\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.663848 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fa9407a-9a04-4712-bac8-d89067b14304-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.794821 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fa9407a-9a04-4712-bac8-d89067b14304" (UID: "7fa9407a-9a04-4712-bac8-d89067b14304"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.805978 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-config-data" (OuterVolumeSpecName: "config-data") pod "7fa9407a-9a04-4712-bac8-d89067b14304" (UID: "7fa9407a-9a04-4712-bac8-d89067b14304"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.877184 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:28 crc kubenswrapper[4831]: I1204 10:35:28.877234 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa9407a-9a04-4712-bac8-d89067b14304-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.082311 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"63dffe41-8eb3-4696-bfdc-de4fe1735e19","Type":"ContainerStarted","Data":"7e81ad5300a0a7f4247e37d94f9ded775db9bc0a8cc2458784f4aefbe5b5f97c"} Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.086710 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b4fe343-759d-470c-9e6e-1daba0a5d58d","Type":"ContainerStarted","Data":"bd919d5e1909a1e207ccd77fd162a09cad523e0a7fd2c88e5498805dc1a8bfd3"} Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.094682 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f61bdc02-a856-448f-9cdf-f3c43efc4bfc","Type":"ContainerStarted","Data":"e4dd2ecd15dd5135a0eb8f8a09004c4a12d8a1de69f3fb5a89b9750a2f2dcf74"} Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 
10:35:29.094779 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.098873 4831 generic.go:334] "Generic (PLEG): container finished" podID="a02d0bff-55e1-4de5-95e4-98d65018cbf0" containerID="a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3" exitCode=0 Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.098942 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f849bb84-btxkg" event={"ID":"a02d0bff-55e1-4de5-95e4-98d65018cbf0","Type":"ContainerDied","Data":"a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3"} Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.104073 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="72ca3a62-9e01-4f14-8b45-6db4951fb6e6" containerName="cinder-scheduler" containerID="cri-o://2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa" gracePeriod=30 Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.104449 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.109955 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="72ca3a62-9e01-4f14-8b45-6db4951fb6e6" containerName="probe" containerID="cri-o://17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f" gracePeriod=30 Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.110137 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fa9407a-9a04-4712-bac8-d89067b14304","Type":"ContainerDied","Data":"b401e568c3b353c6e76d648da930dd5b0aff62abe12d2b538e33eae5a75ea49d"} Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.110183 4831 scope.go:117] "RemoveContainer" containerID="8cb7483f62005d83064a87e635c4881257cb8cba0cb7f365e44c81c19700294a" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.118788 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.141562 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.141538646 podStartE2EDuration="5.141538646s" podCreationTimestamp="2025-12-04 10:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:29.126615829 +0000 UTC m=+1226.075791163" watchObservedRunningTime="2025-12-04 10:35:29.141538646 +0000 UTC m=+1226.090713960" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.205501 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.212937 4831 scope.go:117] "RemoveContainer" containerID="13cd496430f493f3d5eb1c1f803b5815d3d73e339423997d2be1e31c268f6fb7" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.213103 
4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.235479 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7656fdcbd7-2jzrb"] Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.235784 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" podUID="a7a16dd7-b208-4a3a-9309-0ba292ed12fe" containerName="dnsmasq-dns" containerID="cri-o://36db02e40224a63052c46c613debcf4d8c24d650642a0d056f187b6071ea0748" gracePeriod=10 Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.247877 4831 scope.go:117] "RemoveContainer" containerID="0513a579e5eccabc0b8e66fb008748bbcdc24aad70905a810665e9e5784cd89a" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.257679 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:29 crc kubenswrapper[4831]: E1204 10:35:29.258064 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="ceilometer-notification-agent" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.258080 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="ceilometer-notification-agent" Dec 04 10:35:29 crc kubenswrapper[4831]: E1204 10:35:29.258098 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="sg-core" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.258106 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="sg-core" Dec 04 10:35:29 crc kubenswrapper[4831]: E1204 10:35:29.258128 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="proxy-httpd" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 
10:35:29.258134 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="proxy-httpd" Dec 04 10:35:29 crc kubenswrapper[4831]: E1204 10:35:29.258142 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="ceilometer-central-agent" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.258148 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="ceilometer-central-agent" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.258325 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="sg-core" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.258344 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="ceilometer-central-agent" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.258357 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="proxy-httpd" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.258369 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" containerName="ceilometer-notification-agent" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.260104 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.269196 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.269224 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.274395 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.283907 4831 scope.go:117] "RemoveContainer" containerID="55d6de675878e9339fa5a8febe11f79629911d2d33419b23452f26a864f1668b" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.315760 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf" path="/var/lib/kubelet/pods/2a333fe3-7c6e-419f-81c7-8a5a60d9c2bf/volumes" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.316783 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fa9407a-9a04-4712-bac8-d89067b14304" path="/var/lib/kubelet/pods/7fa9407a-9a04-4712-bac8-d89067b14304/volumes" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.390061 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.391077 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-log-httpd\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: 
I1204 10:35:29.391107 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-scripts\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.391207 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-run-httpd\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.391229 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-config-data\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.391249 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.391299 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmkbp\" (UniqueName: \"kubernetes.io/projected/46d9cb17-c424-47e2-a9f3-d00f479be770-kube-api-access-nmkbp\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.493362 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-log-httpd\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.493439 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-scripts\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.493582 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-run-httpd\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.493638 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-config-data\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.493698 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.493730 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmkbp\" (UniqueName: \"kubernetes.io/projected/46d9cb17-c424-47e2-a9f3-d00f479be770-kube-api-access-nmkbp\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 
10:35:29.493826 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-log-httpd\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.493878 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.494040 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-run-httpd\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.499413 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.521341 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-scripts\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.521475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") 
" pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.522419 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-config-data\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.528578 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmkbp\" (UniqueName: \"kubernetes.io/projected/46d9cb17-c424-47e2-a9f3-d00f479be770-kube-api-access-nmkbp\") pod \"ceilometer-0\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " pod="openstack/ceilometer-0" Dec 04 10:35:29 crc kubenswrapper[4831]: I1204 10:35:29.596041 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.155586 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7a16dd7-b208-4a3a-9309-0ba292ed12fe" containerID="36db02e40224a63052c46c613debcf4d8c24d650642a0d056f187b6071ea0748" exitCode=0 Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.155698 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" event={"ID":"a7a16dd7-b208-4a3a-9309-0ba292ed12fe","Type":"ContainerDied","Data":"36db02e40224a63052c46c613debcf4d8c24d650642a0d056f187b6071ea0748"} Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.166603 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.168087 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"63dffe41-8eb3-4696-bfdc-de4fe1735e19","Type":"ContainerStarted","Data":"ffdf0493afea65d971fe4de728ed8cbca86bac9e71ba09e6a8be9114d83960fb"} Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.177409 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b4fe343-759d-470c-9e6e-1daba0a5d58d","Type":"ContainerStarted","Data":"f0024ea85d13065fa874af4a564db1a8f568886102b923f30d5f3a98dfb18c83"} Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.208881 4831 generic.go:334] "Generic (PLEG): container finished" podID="72ca3a62-9e01-4f14-8b45-6db4951fb6e6" containerID="17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f" exitCode=0 Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.210213 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72ca3a62-9e01-4f14-8b45-6db4951fb6e6","Type":"ContainerDied","Data":"17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f"} Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.214381 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.214361331 podStartE2EDuration="4.214361331s" podCreationTimestamp="2025-12-04 10:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:30.198352726 +0000 UTC m=+1227.147528040" watchObservedRunningTime="2025-12-04 10:35:30.214361331 +0000 UTC m=+1227.163536645" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.324108 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.417069 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-swift-storage-0\") pod \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.417163 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-nb\") pod \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.417198 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-config\") pod \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.417379 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-sb\") pod \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.417420 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-svc\") pod \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.417450 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d52tn\" 
(UniqueName: \"kubernetes.io/projected/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-kube-api-access-d52tn\") pod \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.424633 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-kube-api-access-d52tn" (OuterVolumeSpecName: "kube-api-access-d52tn") pod "a7a16dd7-b208-4a3a-9309-0ba292ed12fe" (UID: "a7a16dd7-b208-4a3a-9309-0ba292ed12fe"). InnerVolumeSpecName "kube-api-access-d52tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.486584 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a7a16dd7-b208-4a3a-9309-0ba292ed12fe" (UID: "a7a16dd7-b208-4a3a-9309-0ba292ed12fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.490137 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a7a16dd7-b208-4a3a-9309-0ba292ed12fe" (UID: "a7a16dd7-b208-4a3a-9309-0ba292ed12fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.497133 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-config" (OuterVolumeSpecName: "config") pod "a7a16dd7-b208-4a3a-9309-0ba292ed12fe" (UID: "a7a16dd7-b208-4a3a-9309-0ba292ed12fe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:30 crc kubenswrapper[4831]: E1204 10:35:30.501938 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-swift-storage-0 podName:a7a16dd7-b208-4a3a-9309-0ba292ed12fe nodeName:}" failed. No retries permitted until 2025-12-04 10:35:31.001915137 +0000 UTC m=+1227.951090451 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-swift-storage-0") pod "a7a16dd7-b208-4a3a-9309-0ba292ed12fe" (UID: "a7a16dd7-b208-4a3a-9309-0ba292ed12fe") : error deleting /var/lib/kubelet/pods/a7a16dd7-b208-4a3a-9309-0ba292ed12fe/volume-subpaths: remove /var/lib/kubelet/pods/a7a16dd7-b208-4a3a-9309-0ba292ed12fe/volume-subpaths: no such file or directory Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.502235 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a7a16dd7-b208-4a3a-9309-0ba292ed12fe" (UID: "a7a16dd7-b208-4a3a-9309-0ba292ed12fe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.519546 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.519576 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.519584 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.519593 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d52tn\" (UniqueName: \"kubernetes.io/projected/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-kube-api-access-d52tn\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.519602 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:30 crc kubenswrapper[4831]: I1204 10:35:30.961307 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.036754 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-swift-storage-0\") pod \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\" (UID: \"a7a16dd7-b208-4a3a-9309-0ba292ed12fe\") " Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.037903 4831 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a7a16dd7-b208-4a3a-9309-0ba292ed12fe" (UID: "a7a16dd7-b208-4a3a-9309-0ba292ed12fe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.047045 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.139396 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7a16dd7-b208-4a3a-9309-0ba292ed12fe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.172567 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.235966 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" event={"ID":"a7a16dd7-b208-4a3a-9309-0ba292ed12fe","Type":"ContainerDied","Data":"44c9d2d596f73dd964f999f2ed0d4ad78628af94bde3de97b9fcf9b5864fbacf"} Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.236034 4831 scope.go:117] "RemoveContainer" containerID="36db02e40224a63052c46c613debcf4d8c24d650642a0d056f187b6071ea0748" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.236035 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7656fdcbd7-2jzrb" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.239234 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"63dffe41-8eb3-4696-bfdc-de4fe1735e19","Type":"ContainerStarted","Data":"9c711f4d0bf2a86ce430041bc836b37392f167f0530a4b24f3f2c99ff417d2fc"} Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.240653 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data-custom\") pod \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.240795 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-combined-ca-bundle\") pod \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.240822 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data\") pod \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.240862 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65zwx\" (UniqueName: \"kubernetes.io/projected/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-kube-api-access-65zwx\") pod \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.240939 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-scripts\") pod \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.240956 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-etc-machine-id\") pod \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\" (UID: \"72ca3a62-9e01-4f14-8b45-6db4951fb6e6\") " Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.241375 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "72ca3a62-9e01-4f14-8b45-6db4951fb6e6" (UID: "72ca3a62-9e01-4f14-8b45-6db4951fb6e6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.245774 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d9cb17-c424-47e2-a9f3-d00f479be770","Type":"ContainerStarted","Data":"290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87"} Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.245820 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d9cb17-c424-47e2-a9f3-d00f479be770","Type":"ContainerStarted","Data":"3b8c12573adf18db918ed0fa3b95780b2a3f6b5406fda505d663c22c5fabd99b"} Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.254036 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-scripts" (OuterVolumeSpecName: "scripts") pod "72ca3a62-9e01-4f14-8b45-6db4951fb6e6" (UID: "72ca3a62-9e01-4f14-8b45-6db4951fb6e6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.254099 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-kube-api-access-65zwx" (OuterVolumeSpecName: "kube-api-access-65zwx") pod "72ca3a62-9e01-4f14-8b45-6db4951fb6e6" (UID: "72ca3a62-9e01-4f14-8b45-6db4951fb6e6"). InnerVolumeSpecName "kube-api-access-65zwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.255636 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "72ca3a62-9e01-4f14-8b45-6db4951fb6e6" (UID: "72ca3a62-9e01-4f14-8b45-6db4951fb6e6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.259135 4831 generic.go:334] "Generic (PLEG): container finished" podID="72ca3a62-9e01-4f14-8b45-6db4951fb6e6" containerID="2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa" exitCode=0 Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.259785 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.260089 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72ca3a62-9e01-4f14-8b45-6db4951fb6e6","Type":"ContainerDied","Data":"2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa"} Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.260119 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.260131 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72ca3a62-9e01-4f14-8b45-6db4951fb6e6","Type":"ContainerDied","Data":"f5c0f6158fd16b893efde2972c39137b2cdfb5869987b5694af4623a589af274"} Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.271168 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.271151882 podStartE2EDuration="4.271151882s" podCreationTimestamp="2025-12-04 10:35:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:31.267925926 +0000 UTC m=+1228.217101240" watchObservedRunningTime="2025-12-04 10:35:31.271151882 +0000 UTC m=+1228.220327196" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.297893 4831 scope.go:117] "RemoveContainer" containerID="294914b80011213e69f277fa86f5427efde413629150503558c967245506f485" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.305229 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.305273 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7656fdcbd7-2jzrb"] Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.312310 4831 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7656fdcbd7-2jzrb"] Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.337026 4831 scope.go:117] "RemoveContainer" containerID="17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.343687 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65zwx\" (UniqueName: \"kubernetes.io/projected/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-kube-api-access-65zwx\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.343716 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.343729 4831 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.343742 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.368556 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72ca3a62-9e01-4f14-8b45-6db4951fb6e6" (UID: "72ca3a62-9e01-4f14-8b45-6db4951fb6e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.373434 4831 scope.go:117] "RemoveContainer" containerID="2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.394648 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.419449 4831 scope.go:117] "RemoveContainer" containerID="17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f" Dec 04 10:35:31 crc kubenswrapper[4831]: E1204 10:35:31.419999 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f\": container with ID starting with 17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f not found: ID does not exist" containerID="17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.420061 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f"} err="failed to get container status \"17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f\": rpc error: code = NotFound desc = could not find container \"17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f\": container with ID starting with 17e1d0021386645c2462c680dfa2a92e0a4b55f149a8e05b1eb361431e680f2f not found: ID does not exist" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.420097 4831 scope.go:117] "RemoveContainer" containerID="2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa" Dec 04 10:35:31 crc kubenswrapper[4831]: E1204 10:35:31.421981 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa\": container with ID starting with 2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa not found: ID does not exist" containerID="2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.422013 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa"} err="failed to get container status \"2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa\": rpc error: code = NotFound desc = could not find container \"2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa\": container with ID starting with 2c0e04e721d4a92800cd581e83deeaa1f392408423c211e35d354d64cc9bfefa not found: ID does not exist" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.423405 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data" (OuterVolumeSpecName: "config-data") pod "72ca3a62-9e01-4f14-8b45-6db4951fb6e6" (UID: "72ca3a62-9e01-4f14-8b45-6db4951fb6e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.444777 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.444813 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ca3a62-9e01-4f14-8b45-6db4951fb6e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.592154 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.604052 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.624249 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:35:31 crc kubenswrapper[4831]: E1204 10:35:31.624727 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a16dd7-b208-4a3a-9309-0ba292ed12fe" containerName="dnsmasq-dns" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.624754 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a16dd7-b208-4a3a-9309-0ba292ed12fe" containerName="dnsmasq-dns" Dec 04 10:35:31 crc kubenswrapper[4831]: E1204 10:35:31.624767 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ca3a62-9e01-4f14-8b45-6db4951fb6e6" containerName="cinder-scheduler" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.624777 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ca3a62-9e01-4f14-8b45-6db4951fb6e6" containerName="cinder-scheduler" Dec 04 10:35:31 crc kubenswrapper[4831]: E1204 10:35:31.624804 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="72ca3a62-9e01-4f14-8b45-6db4951fb6e6" containerName="probe" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.624814 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ca3a62-9e01-4f14-8b45-6db4951fb6e6" containerName="probe" Dec 04 10:35:31 crc kubenswrapper[4831]: E1204 10:35:31.624857 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a16dd7-b208-4a3a-9309-0ba292ed12fe" containerName="init" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.624866 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a16dd7-b208-4a3a-9309-0ba292ed12fe" containerName="init" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.625103 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ca3a62-9e01-4f14-8b45-6db4951fb6e6" containerName="cinder-scheduler" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.625128 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ca3a62-9e01-4f14-8b45-6db4951fb6e6" containerName="probe" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.625266 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a16dd7-b208-4a3a-9309-0ba292ed12fe" containerName="dnsmasq-dns" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.626567 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.632350 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.646639 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.750621 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmkkr\" (UniqueName: \"kubernetes.io/projected/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-kube-api-access-zmkkr\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.750964 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.751020 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.751034 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.751156 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.751232 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.853216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.853261 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.853399 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.853450 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.853495 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmkkr\" (UniqueName: \"kubernetes.io/projected/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-kube-api-access-zmkkr\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.853535 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.859165 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.859230 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.860801 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " 
pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.860928 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.862986 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.873623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmkkr\" (UniqueName: \"kubernetes.io/projected/5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f-kube-api-access-zmkkr\") pod \"cinder-scheduler-0\" (UID: \"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f\") " pod="openstack/cinder-scheduler-0" Dec 04 10:35:31 crc kubenswrapper[4831]: I1204 10:35:31.996915 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:35:32 crc kubenswrapper[4831]: I1204 10:35:32.288233 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d9cb17-c424-47e2-a9f3-d00f479be770","Type":"ContainerStarted","Data":"898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd"} Dec 04 10:35:32 crc kubenswrapper[4831]: I1204 10:35:32.288569 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d9cb17-c424-47e2-a9f3-d00f479be770","Type":"ContainerStarted","Data":"fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807"} Dec 04 10:35:32 crc kubenswrapper[4831]: I1204 10:35:32.289239 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9b4fe343-759d-470c-9e6e-1daba0a5d58d" containerName="glance-log" containerID="cri-o://bd919d5e1909a1e207ccd77fd162a09cad523e0a7fd2c88e5498805dc1a8bfd3" gracePeriod=30 Dec 04 10:35:32 crc kubenswrapper[4831]: I1204 10:35:32.289349 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9b4fe343-759d-470c-9e6e-1daba0a5d58d" containerName="glance-httpd" containerID="cri-o://f0024ea85d13065fa874af4a564db1a8f568886102b923f30d5f3a98dfb18c83" gracePeriod=30 Dec 04 10:35:32 crc kubenswrapper[4831]: W1204 10:35:32.478253 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a0cad59_03e2_4ed0_84ee_c51f7c1b5e3f.slice/crio-26f3c5b47aa111b765121b1ed7880a0797b995280f2ac50c1bf60a1e2f406ebd WatchSource:0}: Error finding container 26f3c5b47aa111b765121b1ed7880a0797b995280f2ac50c1bf60a1e2f406ebd: Status 404 returned error can't find the container with id 26f3c5b47aa111b765121b1ed7880a0797b995280f2ac50c1bf60a1e2f406ebd Dec 04 10:35:32 crc kubenswrapper[4831]: I1204 10:35:32.480762 4831 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:35:32 crc kubenswrapper[4831]: I1204 10:35:32.578695 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:32 crc kubenswrapper[4831]: I1204 10:35:32.901333 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.316959 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ca3a62-9e01-4f14-8b45-6db4951fb6e6" path="/var/lib/kubelet/pods/72ca3a62-9e01-4f14-8b45-6db4951fb6e6/volumes" Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.317046 4831 generic.go:334] "Generic (PLEG): container finished" podID="9b4fe343-759d-470c-9e6e-1daba0a5d58d" containerID="f0024ea85d13065fa874af4a564db1a8f568886102b923f30d5f3a98dfb18c83" exitCode=0 Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.317492 4831 generic.go:334] "Generic (PLEG): container finished" podID="9b4fe343-759d-470c-9e6e-1daba0a5d58d" containerID="bd919d5e1909a1e207ccd77fd162a09cad523e0a7fd2c88e5498805dc1a8bfd3" exitCode=143 Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.318452 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a16dd7-b208-4a3a-9309-0ba292ed12fe" path="/var/lib/kubelet/pods/a7a16dd7-b208-4a3a-9309-0ba292ed12fe/volumes" Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.319170 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f","Type":"ContainerStarted","Data":"26f3c5b47aa111b765121b1ed7880a0797b995280f2ac50c1bf60a1e2f406ebd"} Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.319198 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"9b4fe343-759d-470c-9e6e-1daba0a5d58d","Type":"ContainerDied","Data":"f0024ea85d13065fa874af4a564db1a8f568886102b923f30d5f3a98dfb18c83"} Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.319213 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b4fe343-759d-470c-9e6e-1daba0a5d58d","Type":"ContainerDied","Data":"bd919d5e1909a1e207ccd77fd162a09cad523e0a7fd2c88e5498805dc1a8bfd3"} Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.319360 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="63dffe41-8eb3-4696-bfdc-de4fe1735e19" containerName="glance-log" containerID="cri-o://ffdf0493afea65d971fe4de728ed8cbca86bac9e71ba09e6a8be9114d83960fb" gracePeriod=30 Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.319525 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="63dffe41-8eb3-4696-bfdc-de4fe1735e19" containerName="glance-httpd" containerID="cri-o://9c711f4d0bf2a86ce430041bc836b37392f167f0530a4b24f3f2c99ff417d2fc" gracePeriod=30 Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.971123 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-b1d5-account-create-lq99p"] Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.978622 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b1d5-account-create-lq99p" Dec 04 10:35:33 crc kubenswrapper[4831]: I1204 10:35:33.981132 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.000729 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b1d5-account-create-lq99p"] Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.108117 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwrds\" (UniqueName: \"kubernetes.io/projected/d0cfead2-fbc2-4c82-a779-c1419a2bdd13-kube-api-access-xwrds\") pod \"nova-api-b1d5-account-create-lq99p\" (UID: \"d0cfead2-fbc2-4c82-a779-c1419a2bdd13\") " pod="openstack/nova-api-b1d5-account-create-lq99p" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.203734 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2dcb-account-create-xjrzk"] Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.208086 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2dcb-account-create-xjrzk" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.214404 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.217788 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.221846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwrds\" (UniqueName: \"kubernetes.io/projected/d0cfead2-fbc2-4c82-a779-c1419a2bdd13-kube-api-access-xwrds\") pod \"nova-api-b1d5-account-create-lq99p\" (UID: \"d0cfead2-fbc2-4c82-a779-c1419a2bdd13\") " pod="openstack/nova-api-b1d5-account-create-lq99p" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.232752 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2dcb-account-create-xjrzk"] Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.259988 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwrds\" (UniqueName: \"kubernetes.io/projected/d0cfead2-fbc2-4c82-a779-c1419a2bdd13-kube-api-access-xwrds\") pod \"nova-api-b1d5-account-create-lq99p\" (UID: \"d0cfead2-fbc2-4c82-a779-c1419a2bdd13\") " pod="openstack/nova-api-b1d5-account-create-lq99p" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.324277 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-scripts\") pod \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.324353 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-httpd-run\") pod \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.324420 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-config-data\") pod 
\"9b4fe343-759d-470c-9e6e-1daba0a5d58d\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.324518 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.324557 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-logs\") pod \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.324585 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-combined-ca-bundle\") pod \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.324637 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-public-tls-certs\") pod \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.324685 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5x8n\" (UniqueName: \"kubernetes.io/projected/9b4fe343-759d-470c-9e6e-1daba0a5d58d-kube-api-access-l5x8n\") pod \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\" (UID: \"9b4fe343-759d-470c-9e6e-1daba0a5d58d\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.324734 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9b4fe343-759d-470c-9e6e-1daba0a5d58d" (UID: "9b4fe343-759d-470c-9e6e-1daba0a5d58d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.324973 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-logs" (OuterVolumeSpecName: "logs") pod "9b4fe343-759d-470c-9e6e-1daba0a5d58d" (UID: "9b4fe343-759d-470c-9e6e-1daba0a5d58d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.325193 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjn6\" (UniqueName: \"kubernetes.io/projected/bcf3269a-7a62-450b-b6e3-5b18451dd26f-kube-api-access-xvjn6\") pod \"nova-cell0-2dcb-account-create-xjrzk\" (UID: \"bcf3269a-7a62-450b-b6e3-5b18451dd26f\") " pod="openstack/nova-cell0-2dcb-account-create-xjrzk" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.325277 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.325292 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b4fe343-759d-470c-9e6e-1daba0a5d58d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.328727 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-scripts" (OuterVolumeSpecName: "scripts") pod "9b4fe343-759d-470c-9e6e-1daba0a5d58d" (UID: "9b4fe343-759d-470c-9e6e-1daba0a5d58d"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.333983 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4fe343-759d-470c-9e6e-1daba0a5d58d-kube-api-access-l5x8n" (OuterVolumeSpecName: "kube-api-access-l5x8n") pod "9b4fe343-759d-470c-9e6e-1daba0a5d58d" (UID: "9b4fe343-759d-470c-9e6e-1daba0a5d58d"). InnerVolumeSpecName "kube-api-access-l5x8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.362833 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "9b4fe343-759d-470c-9e6e-1daba0a5d58d" (UID: "9b4fe343-759d-470c-9e6e-1daba0a5d58d"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.380617 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-63ed-account-create-lqdgk"] Dec 04 10:35:34 crc kubenswrapper[4831]: E1204 10:35:34.381159 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4fe343-759d-470c-9e6e-1daba0a5d58d" containerName="glance-log" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.381178 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4fe343-759d-470c-9e6e-1daba0a5d58d" containerName="glance-log" Dec 04 10:35:34 crc kubenswrapper[4831]: E1204 10:35:34.381227 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4fe343-759d-470c-9e6e-1daba0a5d58d" containerName="glance-httpd" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.381234 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4fe343-759d-470c-9e6e-1daba0a5d58d" containerName="glance-httpd" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.381401 4831 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9b4fe343-759d-470c-9e6e-1daba0a5d58d" containerName="glance-log" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.381420 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4fe343-759d-470c-9e6e-1daba0a5d58d" containerName="glance-httpd" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.382068 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-63ed-account-create-lqdgk" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.385520 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.392125 4831 generic.go:334] "Generic (PLEG): container finished" podID="63dffe41-8eb3-4696-bfdc-de4fe1735e19" containerID="9c711f4d0bf2a86ce430041bc836b37392f167f0530a4b24f3f2c99ff417d2fc" exitCode=0 Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.392159 4831 generic.go:334] "Generic (PLEG): container finished" podID="63dffe41-8eb3-4696-bfdc-de4fe1735e19" containerID="ffdf0493afea65d971fe4de728ed8cbca86bac9e71ba09e6a8be9114d83960fb" exitCode=143 Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.392205 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"63dffe41-8eb3-4696-bfdc-de4fe1735e19","Type":"ContainerDied","Data":"9c711f4d0bf2a86ce430041bc836b37392f167f0530a4b24f3f2c99ff417d2fc"} Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.392236 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"63dffe41-8eb3-4696-bfdc-de4fe1735e19","Type":"ContainerDied","Data":"ffdf0493afea65d971fe4de728ed8cbca86bac9e71ba09e6a8be9114d83960fb"} Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.401287 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b4fe343-759d-470c-9e6e-1daba0a5d58d" (UID: "9b4fe343-759d-470c-9e6e-1daba0a5d58d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.402212 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-63ed-account-create-lqdgk"] Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.408187 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f","Type":"ContainerStarted","Data":"5943e03e9ff0ef22c92eca3015824131f5bd6352c23c70753a032482377532a2"} Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.422418 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b4fe343-759d-470c-9e6e-1daba0a5d58d","Type":"ContainerDied","Data":"8898853e5a3437b117d1d5605846936787d0a50434e0d5fece9f2b0e6cdf66a8"} Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.422493 4831 scope.go:117] "RemoveContainer" containerID="f0024ea85d13065fa874af4a564db1a8f568886102b923f30d5f3a98dfb18c83" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.423024 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.429459 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjn6\" (UniqueName: \"kubernetes.io/projected/bcf3269a-7a62-450b-b6e3-5b18451dd26f-kube-api-access-xvjn6\") pod \"nova-cell0-2dcb-account-create-xjrzk\" (UID: \"bcf3269a-7a62-450b-b6e3-5b18451dd26f\") " pod="openstack/nova-cell0-2dcb-account-create-xjrzk" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.431016 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.431504 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.431530 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5x8n\" (UniqueName: \"kubernetes.io/projected/9b4fe343-759d-470c-9e6e-1daba0a5d58d-kube-api-access-l5x8n\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.431542 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.438790 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-config-data" (OuterVolumeSpecName: "config-data") pod "9b4fe343-759d-470c-9e6e-1daba0a5d58d" (UID: "9b4fe343-759d-470c-9e6e-1daba0a5d58d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.452714 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d9cb17-c424-47e2-a9f3-d00f479be770","Type":"ContainerStarted","Data":"73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0"} Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.452941 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="ceilometer-central-agent" containerID="cri-o://290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87" gracePeriod=30 Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.453282 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.455295 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="sg-core" containerID="cri-o://898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd" gracePeriod=30 Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.455422 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="proxy-httpd" containerID="cri-o://73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0" gracePeriod=30 Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.455696 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="ceilometer-notification-agent" containerID="cri-o://fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807" gracePeriod=30 Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.456092 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjn6\" (UniqueName: \"kubernetes.io/projected/bcf3269a-7a62-450b-b6e3-5b18451dd26f-kube-api-access-xvjn6\") pod \"nova-cell0-2dcb-account-create-xjrzk\" (UID: \"bcf3269a-7a62-450b-b6e3-5b18451dd26f\") " pod="openstack/nova-cell0-2dcb-account-create-xjrzk" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.486574 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.517825 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b1d5-account-create-lq99p" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.526633 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8292947590000002 podStartE2EDuration="5.526607092s" podCreationTimestamp="2025-12-04 10:35:29 +0000 UTC" firstStartedPulling="2025-12-04 10:35:30.211871475 +0000 UTC m=+1227.161046789" lastFinishedPulling="2025-12-04 10:35:33.909183808 +0000 UTC m=+1230.858359122" observedRunningTime="2025-12-04 10:35:34.471499689 +0000 UTC m=+1231.420675023" watchObservedRunningTime="2025-12-04 10:35:34.526607092 +0000 UTC m=+1231.475782406" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.532499 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2dcb-account-create-xjrzk" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.534867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm4r8\" (UniqueName: \"kubernetes.io/projected/65a7e487-a7d3-491f-bc96-e7e2ff378a2c-kube-api-access-sm4r8\") pod \"nova-cell1-63ed-account-create-lqdgk\" (UID: \"65a7e487-a7d3-491f-bc96-e7e2ff378a2c\") " pod="openstack/nova-cell1-63ed-account-create-lqdgk" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.535100 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.535125 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.581977 4831 scope.go:117] "RemoveContainer" containerID="bd919d5e1909a1e207ccd77fd162a09cad523e0a7fd2c88e5498805dc1a8bfd3" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.584652 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9b4fe343-759d-470c-9e6e-1daba0a5d58d" (UID: "9b4fe343-759d-470c-9e6e-1daba0a5d58d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.614086 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.646562 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm4r8\" (UniqueName: \"kubernetes.io/projected/65a7e487-a7d3-491f-bc96-e7e2ff378a2c-kube-api-access-sm4r8\") pod \"nova-cell1-63ed-account-create-lqdgk\" (UID: \"65a7e487-a7d3-491f-bc96-e7e2ff378a2c\") " pod="openstack/nova-cell1-63ed-account-create-lqdgk" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.648185 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4fe343-759d-470c-9e6e-1daba0a5d58d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.672726 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm4r8\" (UniqueName: \"kubernetes.io/projected/65a7e487-a7d3-491f-bc96-e7e2ff378a2c-kube-api-access-sm4r8\") pod \"nova-cell1-63ed-account-create-lqdgk\" (UID: \"65a7e487-a7d3-491f-bc96-e7e2ff378a2c\") " pod="openstack/nova-cell1-63ed-account-create-lqdgk" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.718191 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-63ed-account-create-lqdgk" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.748936 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.749234 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-config-data\") pod \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.749332 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-httpd-run\") pod \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.749400 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvgx7\" (UniqueName: \"kubernetes.io/projected/63dffe41-8eb3-4696-bfdc-de4fe1735e19-kube-api-access-cvgx7\") pod \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.749428 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-combined-ca-bundle\") pod \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.749463 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-logs\") pod \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.749487 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-internal-tls-certs\") pod \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.749622 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-scripts\") pod \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\" (UID: \"63dffe41-8eb3-4696-bfdc-de4fe1735e19\") " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.749724 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "63dffe41-8eb3-4696-bfdc-de4fe1735e19" (UID: "63dffe41-8eb3-4696-bfdc-de4fe1735e19"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.749950 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-logs" (OuterVolumeSpecName: "logs") pod "63dffe41-8eb3-4696-bfdc-de4fe1735e19" (UID: "63dffe41-8eb3-4696-bfdc-de4fe1735e19"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.750001 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.753524 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "63dffe41-8eb3-4696-bfdc-de4fe1735e19" (UID: "63dffe41-8eb3-4696-bfdc-de4fe1735e19"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.757172 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-scripts" (OuterVolumeSpecName: "scripts") pod "63dffe41-8eb3-4696-bfdc-de4fe1735e19" (UID: "63dffe41-8eb3-4696-bfdc-de4fe1735e19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.759923 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63dffe41-8eb3-4696-bfdc-de4fe1735e19-kube-api-access-cvgx7" (OuterVolumeSpecName: "kube-api-access-cvgx7") pod "63dffe41-8eb3-4696-bfdc-de4fe1735e19" (UID: "63dffe41-8eb3-4696-bfdc-de4fe1735e19"). InnerVolumeSpecName "kube-api-access-cvgx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.812905 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.828490 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.834533 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63dffe41-8eb3-4696-bfdc-de4fe1735e19" (UID: "63dffe41-8eb3-4696-bfdc-de4fe1735e19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.852353 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvgx7\" (UniqueName: \"kubernetes.io/projected/63dffe41-8eb3-4696-bfdc-de4fe1735e19-kube-api-access-cvgx7\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.852526 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.852585 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dffe41-8eb3-4696-bfdc-de4fe1735e19-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.852641 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.852852 4831 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.862931 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:34 crc kubenswrapper[4831]: E1204 10:35:34.863518 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dffe41-8eb3-4696-bfdc-de4fe1735e19" containerName="glance-httpd" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.863547 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dffe41-8eb3-4696-bfdc-de4fe1735e19" containerName="glance-httpd" Dec 04 10:35:34 crc kubenswrapper[4831]: E1204 10:35:34.863573 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dffe41-8eb3-4696-bfdc-de4fe1735e19" containerName="glance-log" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.863579 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dffe41-8eb3-4696-bfdc-de4fe1735e19" containerName="glance-log" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.863833 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="63dffe41-8eb3-4696-bfdc-de4fe1735e19" containerName="glance-log" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.863849 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="63dffe41-8eb3-4696-bfdc-de4fe1735e19" containerName="glance-httpd" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.865145 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.869043 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.869364 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.880375 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.932155 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.958925 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.971114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-config-data" (OuterVolumeSpecName: "config-data") pod "63dffe41-8eb3-4696-bfdc-de4fe1735e19" (UID: "63dffe41-8eb3-4696-bfdc-de4fe1735e19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:34 crc kubenswrapper[4831]: I1204 10:35:34.980673 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "63dffe41-8eb3-4696-bfdc-de4fe1735e19" (UID: "63dffe41-8eb3-4696-bfdc-de4fe1735e19"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.060756 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-scripts\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.060843 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66n6d\" (UniqueName: \"kubernetes.io/projected/110d0cb6-3ad6-4d48-ae88-1864408c86af-kube-api-access-66n6d\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.060876 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/110d0cb6-3ad6-4d48-ae88-1864408c86af-logs\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.060902 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.060936 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.061007 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/110d0cb6-3ad6-4d48-ae88-1864408c86af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.061077 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.061129 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-config-data\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.061220 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.061236 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dffe41-8eb3-4696-bfdc-de4fe1735e19-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.163625 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.163698 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-config-data\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.163764 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-scripts\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.163795 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66n6d\" (UniqueName: \"kubernetes.io/projected/110d0cb6-3ad6-4d48-ae88-1864408c86af-kube-api-access-66n6d\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.163814 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/110d0cb6-3ad6-4d48-ae88-1864408c86af-logs\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.163852 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.163876 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.163920 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/110d0cb6-3ad6-4d48-ae88-1864408c86af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.164437 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/110d0cb6-3ad6-4d48-ae88-1864408c86af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.164648 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/110d0cb6-3ad6-4d48-ae88-1864408c86af-logs\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.165068 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: 
\"110d0cb6-3ad6-4d48-ae88-1864408c86af\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.177807 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.178608 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-config-data\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.194812 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.206518 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66n6d\" (UniqueName: \"kubernetes.io/projected/110d0cb6-3ad6-4d48-ae88-1864408c86af-kube-api-access-66n6d\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.207218 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/110d0cb6-3ad6-4d48-ae88-1864408c86af-scripts\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " 
pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.254037 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"110d0cb6-3ad6-4d48-ae88-1864408c86af\") " pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.310947 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4fe343-759d-470c-9e6e-1daba0a5d58d" path="/var/lib/kubelet/pods/9b4fe343-759d-470c-9e6e-1daba0a5d58d/volumes" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.328250 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2dcb-account-create-xjrzk"] Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.357327 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b1d5-account-create-lq99p"] Dec 04 10:35:35 crc kubenswrapper[4831]: W1204 10:35:35.391941 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0cfead2_fbc2_4c82_a779_c1419a2bdd13.slice/crio-283363c90bfd1352bae0dca81997274db1c99fdf22676ee21ca73ea4fa159c24 WatchSource:0}: Error finding container 283363c90bfd1352bae0dca81997274db1c99fdf22676ee21ca73ea4fa159c24: Status 404 returned error can't find the container with id 283363c90bfd1352bae0dca81997274db1c99fdf22676ee21ca73ea4fa159c24 Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.398733 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.475791 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b1d5-account-create-lq99p" event={"ID":"d0cfead2-fbc2-4c82-a779-c1419a2bdd13","Type":"ContainerStarted","Data":"283363c90bfd1352bae0dca81997274db1c99fdf22676ee21ca73ea4fa159c24"} Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.485980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2dcb-account-create-xjrzk" event={"ID":"bcf3269a-7a62-450b-b6e3-5b18451dd26f","Type":"ContainerStarted","Data":"a90f8ab588e5132387a2d68b36662e9da8fb197f49efb19fefdd0f5add55d849"} Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.493375 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"63dffe41-8eb3-4696-bfdc-de4fe1735e19","Type":"ContainerDied","Data":"7e81ad5300a0a7f4247e37d94f9ded775db9bc0a8cc2458784f4aefbe5b5f97c"} Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.493432 4831 scope.go:117] "RemoveContainer" containerID="9c711f4d0bf2a86ce430041bc836b37392f167f0530a4b24f3f2c99ff417d2fc" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.493624 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.508586 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f","Type":"ContainerStarted","Data":"d3e375d22b37e3e429eec432e8f0268a4c03f6ec8e3ce93592d56dfabf2e5b94"} Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.524876 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.538242 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.553839 4831 scope.go:117] "RemoveContainer" containerID="ffdf0493afea65d971fe4de728ed8cbca86bac9e71ba09e6a8be9114d83960fb" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.566209 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.566182106 podStartE2EDuration="4.566182106s" podCreationTimestamp="2025-12-04 10:35:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:35.542104446 +0000 UTC m=+1232.491279760" watchObservedRunningTime="2025-12-04 10:35:35.566182106 +0000 UTC m=+1232.515357430" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.574419 4831 generic.go:334] "Generic (PLEG): container finished" podID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerID="73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0" exitCode=0 Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.574465 4831 generic.go:334] "Generic (PLEG): container finished" podID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerID="898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd" exitCode=2 Dec 04 10:35:35 crc kubenswrapper[4831]: 
I1204 10:35:35.574475 4831 generic.go:334] "Generic (PLEG): container finished" podID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerID="fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807" exitCode=0 Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.574500 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d9cb17-c424-47e2-a9f3-d00f479be770","Type":"ContainerDied","Data":"73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0"} Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.574534 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d9cb17-c424-47e2-a9f3-d00f479be770","Type":"ContainerDied","Data":"898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd"} Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.574547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d9cb17-c424-47e2-a9f3-d00f479be770","Type":"ContainerDied","Data":"fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807"} Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.594434 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.598206 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.602041 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.602556 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.637826 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.683584 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4j66\" (UniqueName: \"kubernetes.io/projected/8fb48518-ba79-45b0-8f47-51305a47805a-kube-api-access-j4j66\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.683818 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.683866 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fb48518-ba79-45b0-8f47-51305a47805a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.683907 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fb48518-ba79-45b0-8f47-51305a47805a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.683927 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.683958 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.684059 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.684083 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.716415 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-63ed-account-create-lqdgk"] Dec 
04 10:35:35 crc kubenswrapper[4831]: W1204 10:35:35.768780 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65a7e487_a7d3_491f_bc96_e7e2ff378a2c.slice/crio-e1de2508134953c4aef313573916476b7e5211a5bf66be7736af12f55e54084a WatchSource:0}: Error finding container e1de2508134953c4aef313573916476b7e5211a5bf66be7736af12f55e54084a: Status 404 returned error can't find the container with id e1de2508134953c4aef313573916476b7e5211a5bf66be7736af12f55e54084a Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.785940 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.786023 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fb48518-ba79-45b0-8f47-51305a47805a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.786081 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fb48518-ba79-45b0-8f47-51305a47805a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.786103 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.786138 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.786211 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.786234 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.786304 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4j66\" (UniqueName: \"kubernetes.io/projected/8fb48518-ba79-45b0-8f47-51305a47805a-kube-api-access-j4j66\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.788090 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") device mount path \"/mnt/openstack/pv04\"" 
pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.788142 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fb48518-ba79-45b0-8f47-51305a47805a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.788214 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fb48518-ba79-45b0-8f47-51305a47805a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.799610 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.799882 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.800179 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.800877 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb48518-ba79-45b0-8f47-51305a47805a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.804951 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4j66\" (UniqueName: \"kubernetes.io/projected/8fb48518-ba79-45b0-8f47-51305a47805a-kube-api-access-j4j66\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.823398 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8fb48518-ba79-45b0-8f47-51305a47805a\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:35:35 crc kubenswrapper[4831]: I1204 10:35:35.919254 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.114317 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.399989 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.503291 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-config\") pod \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.503360 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-httpd-config\") pod \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.503426 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf6ft\" (UniqueName: \"kubernetes.io/projected/a02d0bff-55e1-4de5-95e4-98d65018cbf0-kube-api-access-lf6ft\") pod \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.503448 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-combined-ca-bundle\") pod \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.503531 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-ovndb-tls-certs\") pod \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\" (UID: \"a02d0bff-55e1-4de5-95e4-98d65018cbf0\") " Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.520116 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a02d0bff-55e1-4de5-95e4-98d65018cbf0" (UID: "a02d0bff-55e1-4de5-95e4-98d65018cbf0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.527879 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a02d0bff-55e1-4de5-95e4-98d65018cbf0-kube-api-access-lf6ft" (OuterVolumeSpecName: "kube-api-access-lf6ft") pod "a02d0bff-55e1-4de5-95e4-98d65018cbf0" (UID: "a02d0bff-55e1-4de5-95e4-98d65018cbf0"). InnerVolumeSpecName "kube-api-access-lf6ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.581804 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-config" (OuterVolumeSpecName: "config") pod "a02d0bff-55e1-4de5-95e4-98d65018cbf0" (UID: "a02d0bff-55e1-4de5-95e4-98d65018cbf0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.605269 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.605296 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.605307 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf6ft\" (UniqueName: \"kubernetes.io/projected/a02d0bff-55e1-4de5-95e4-98d65018cbf0-kube-api-access-lf6ft\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.623892 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a02d0bff-55e1-4de5-95e4-98d65018cbf0" (UID: "a02d0bff-55e1-4de5-95e4-98d65018cbf0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.630053 4831 generic.go:334] "Generic (PLEG): container finished" podID="a02d0bff-55e1-4de5-95e4-98d65018cbf0" containerID="c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0" exitCode=0 Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.630387 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79f849bb84-btxkg" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.630566 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f849bb84-btxkg" event={"ID":"a02d0bff-55e1-4de5-95e4-98d65018cbf0","Type":"ContainerDied","Data":"c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0"} Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.630646 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f849bb84-btxkg" event={"ID":"a02d0bff-55e1-4de5-95e4-98d65018cbf0","Type":"ContainerDied","Data":"95648dbaee9e864ba2ccf55317ebcdb751ed7d8234452c93f34ad216526502b4"} Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.630690 4831 scope.go:117] "RemoveContainer" containerID="a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.635025 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"110d0cb6-3ad6-4d48-ae88-1864408c86af","Type":"ContainerStarted","Data":"94055f0abcb29bee03b9c8e0256fd11121c9f98ae9e40d7b57f707f4de42ed45"} Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.643640 4831 generic.go:334] "Generic (PLEG): container finished" podID="d0cfead2-fbc2-4c82-a779-c1419a2bdd13" containerID="ff06089c757328619340b12c1a486391c56eb662de2ec8be9963d598b2045d3a" exitCode=0 Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.643714 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b1d5-account-create-lq99p" event={"ID":"d0cfead2-fbc2-4c82-a779-c1419a2bdd13","Type":"ContainerDied","Data":"ff06089c757328619340b12c1a486391c56eb662de2ec8be9963d598b2045d3a"} Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.645090 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.646796 4831 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a02d0bff-55e1-4de5-95e4-98d65018cbf0" (UID: "a02d0bff-55e1-4de5-95e4-98d65018cbf0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.648861 4831 generic.go:334] "Generic (PLEG): container finished" podID="bcf3269a-7a62-450b-b6e3-5b18451dd26f" containerID="fac8026c758d9cd1924bd9e9aa0add42ed0ae439dccf199606ae46498ddb299c" exitCode=0 Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.648961 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2dcb-account-create-xjrzk" event={"ID":"bcf3269a-7a62-450b-b6e3-5b18451dd26f","Type":"ContainerDied","Data":"fac8026c758d9cd1924bd9e9aa0add42ed0ae439dccf199606ae46498ddb299c"} Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.659943 4831 generic.go:334] "Generic (PLEG): container finished" podID="65a7e487-a7d3-491f-bc96-e7e2ff378a2c" containerID="f1f6ba1c9cfda0d1b792b32aa7c5ecd02169e2bed716e062c9ea7a74dabd2a2b" exitCode=0 Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.660239 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-63ed-account-create-lqdgk" event={"ID":"65a7e487-a7d3-491f-bc96-e7e2ff378a2c","Type":"ContainerDied","Data":"f1f6ba1c9cfda0d1b792b32aa7c5ecd02169e2bed716e062c9ea7a74dabd2a2b"} Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.660335 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-63ed-account-create-lqdgk" event={"ID":"65a7e487-a7d3-491f-bc96-e7e2ff378a2c","Type":"ContainerStarted","Data":"e1de2508134953c4aef313573916476b7e5211a5bf66be7736af12f55e54084a"} Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.702964 4831 scope.go:117] "RemoveContainer" 
containerID="c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.707027 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.707055 4831 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a02d0bff-55e1-4de5-95e4-98d65018cbf0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.761412 4831 scope.go:117] "RemoveContainer" containerID="a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3" Dec 04 10:35:36 crc kubenswrapper[4831]: E1204 10:35:36.762034 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3\": container with ID starting with a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3 not found: ID does not exist" containerID="a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.762068 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3"} err="failed to get container status \"a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3\": rpc error: code = NotFound desc = could not find container \"a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3\": container with ID starting with a555dd93ad49421b95a6d98a6065de3c3559238fc0d523fcb3ca02d4764348c3 not found: ID does not exist" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.762088 4831 scope.go:117] "RemoveContainer" 
containerID="c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0" Dec 04 10:35:36 crc kubenswrapper[4831]: E1204 10:35:36.762269 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0\": container with ID starting with c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0 not found: ID does not exist" containerID="c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.762290 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0"} err="failed to get container status \"c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0\": rpc error: code = NotFound desc = could not find container \"c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0\": container with ID starting with c78af9fedf96b9361597f1e3c2127dce97c3295d889d879db9bc8d9d35aeecb0 not found: ID does not exist" Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.975499 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79f849bb84-btxkg"] Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.986935 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-79f849bb84-btxkg"] Dec 04 10:35:36 crc kubenswrapper[4831]: I1204 10:35:36.998705 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 10:35:37 crc kubenswrapper[4831]: I1204 10:35:37.304012 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63dffe41-8eb3-4696-bfdc-de4fe1735e19" path="/var/lib/kubelet/pods/63dffe41-8eb3-4696-bfdc-de4fe1735e19/volumes" Dec 04 10:35:37 crc kubenswrapper[4831]: I1204 10:35:37.305148 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="a02d0bff-55e1-4de5-95e4-98d65018cbf0" path="/var/lib/kubelet/pods/a02d0bff-55e1-4de5-95e4-98d65018cbf0/volumes" Dec 04 10:35:37 crc kubenswrapper[4831]: I1204 10:35:37.722715 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"110d0cb6-3ad6-4d48-ae88-1864408c86af","Type":"ContainerStarted","Data":"5d0fbafe980f99668d35ec0bf5ccfde36833679bac9f86fc513dcd4a0e4046aa"} Dec 04 10:35:37 crc kubenswrapper[4831]: I1204 10:35:37.733283 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8fb48518-ba79-45b0-8f47-51305a47805a","Type":"ContainerStarted","Data":"40cfd12fc862a1e6d5e996b33e66cfff847c08a4335454f3aa9610b9ed4ba4bc"} Dec 04 10:35:37 crc kubenswrapper[4831]: I1204 10:35:37.733329 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8fb48518-ba79-45b0-8f47-51305a47805a","Type":"ContainerStarted","Data":"420008a1495199ad082596a2504a4d0f84ff26bb84ec6a1b034ca2481fc3e800"} Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.199543 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2dcb-account-create-xjrzk" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.246586 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvjn6\" (UniqueName: \"kubernetes.io/projected/bcf3269a-7a62-450b-b6e3-5b18451dd26f-kube-api-access-xvjn6\") pod \"bcf3269a-7a62-450b-b6e3-5b18451dd26f\" (UID: \"bcf3269a-7a62-450b-b6e3-5b18451dd26f\") " Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.267758 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf3269a-7a62-450b-b6e3-5b18451dd26f-kube-api-access-xvjn6" (OuterVolumeSpecName: "kube-api-access-xvjn6") pod "bcf3269a-7a62-450b-b6e3-5b18451dd26f" (UID: "bcf3269a-7a62-450b-b6e3-5b18451dd26f"). InnerVolumeSpecName "kube-api-access-xvjn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.350133 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvjn6\" (UniqueName: \"kubernetes.io/projected/bcf3269a-7a62-450b-b6e3-5b18451dd26f-kube-api-access-xvjn6\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.363552 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.370203 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b1d5-account-create-lq99p" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.375494 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-63ed-account-create-lqdgk" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.559850 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwrds\" (UniqueName: \"kubernetes.io/projected/d0cfead2-fbc2-4c82-a779-c1419a2bdd13-kube-api-access-xwrds\") pod \"d0cfead2-fbc2-4c82-a779-c1419a2bdd13\" (UID: \"d0cfead2-fbc2-4c82-a779-c1419a2bdd13\") " Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.560053 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm4r8\" (UniqueName: \"kubernetes.io/projected/65a7e487-a7d3-491f-bc96-e7e2ff378a2c-kube-api-access-sm4r8\") pod \"65a7e487-a7d3-491f-bc96-e7e2ff378a2c\" (UID: \"65a7e487-a7d3-491f-bc96-e7e2ff378a2c\") " Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.571131 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0cfead2-fbc2-4c82-a779-c1419a2bdd13-kube-api-access-xwrds" (OuterVolumeSpecName: "kube-api-access-xwrds") pod "d0cfead2-fbc2-4c82-a779-c1419a2bdd13" (UID: "d0cfead2-fbc2-4c82-a779-c1419a2bdd13"). InnerVolumeSpecName "kube-api-access-xwrds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.575784 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a7e487-a7d3-491f-bc96-e7e2ff378a2c-kube-api-access-sm4r8" (OuterVolumeSpecName: "kube-api-access-sm4r8") pod "65a7e487-a7d3-491f-bc96-e7e2ff378a2c" (UID: "65a7e487-a7d3-491f-bc96-e7e2ff378a2c"). InnerVolumeSpecName "kube-api-access-sm4r8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.662181 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwrds\" (UniqueName: \"kubernetes.io/projected/d0cfead2-fbc2-4c82-a779-c1419a2bdd13-kube-api-access-xwrds\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.662480 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm4r8\" (UniqueName: \"kubernetes.io/projected/65a7e487-a7d3-491f-bc96-e7e2ff378a2c-kube-api-access-sm4r8\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.745393 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8fb48518-ba79-45b0-8f47-51305a47805a","Type":"ContainerStarted","Data":"1187dfdfb03bde98e44ee3b35aa722918096fae472858fe642e9b45c63cc20c2"} Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.748352 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"110d0cb6-3ad6-4d48-ae88-1864408c86af","Type":"ContainerStarted","Data":"3ec57338c8de3e3b4bb1ca2487af035880af079e54f581d8fd1124fbd090442c"} Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.750240 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b1d5-account-create-lq99p" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.750815 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b1d5-account-create-lq99p" event={"ID":"d0cfead2-fbc2-4c82-a779-c1419a2bdd13","Type":"ContainerDied","Data":"283363c90bfd1352bae0dca81997274db1c99fdf22676ee21ca73ea4fa159c24"} Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.750919 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="283363c90bfd1352bae0dca81997274db1c99fdf22676ee21ca73ea4fa159c24" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.751842 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2dcb-account-create-xjrzk" event={"ID":"bcf3269a-7a62-450b-b6e3-5b18451dd26f","Type":"ContainerDied","Data":"a90f8ab588e5132387a2d68b36662e9da8fb197f49efb19fefdd0f5add55d849"} Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.751934 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90f8ab588e5132387a2d68b36662e9da8fb197f49efb19fefdd0f5add55d849" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.752043 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2dcb-account-create-xjrzk" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.759765 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-63ed-account-create-lqdgk" event={"ID":"65a7e487-a7d3-491f-bc96-e7e2ff378a2c","Type":"ContainerDied","Data":"e1de2508134953c4aef313573916476b7e5211a5bf66be7736af12f55e54084a"} Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.759810 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1de2508134953c4aef313573916476b7e5211a5bf66be7736af12f55e54084a" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.759874 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-63ed-account-create-lqdgk" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.788892 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.788873885 podStartE2EDuration="3.788873885s" podCreationTimestamp="2025-12-04 10:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:38.775381637 +0000 UTC m=+1235.724556951" watchObservedRunningTime="2025-12-04 10:35:38.788873885 +0000 UTC m=+1235.738049199" Dec 04 10:35:38 crc kubenswrapper[4831]: I1204 10:35:38.822521 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.822506048 podStartE2EDuration="4.822506048s" podCreationTimestamp="2025-12-04 10:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:38.817059194 +0000 UTC m=+1235.766234518" watchObservedRunningTime="2025-12-04 10:35:38.822506048 +0000 UTC m=+1235.771681362" Dec 04 10:35:39 crc kubenswrapper[4831]: 
I1204 10:35:39.251311 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.375642 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-combined-ca-bundle\") pod \"46d9cb17-c424-47e2-a9f3-d00f479be770\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.375948 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-log-httpd\") pod \"46d9cb17-c424-47e2-a9f3-d00f479be770\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.375988 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-sg-core-conf-yaml\") pod \"46d9cb17-c424-47e2-a9f3-d00f479be770\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.376025 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-scripts\") pod \"46d9cb17-c424-47e2-a9f3-d00f479be770\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.376053 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-config-data\") pod \"46d9cb17-c424-47e2-a9f3-d00f479be770\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.376101 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-run-httpd\") pod \"46d9cb17-c424-47e2-a9f3-d00f479be770\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.376185 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmkbp\" (UniqueName: \"kubernetes.io/projected/46d9cb17-c424-47e2-a9f3-d00f479be770-kube-api-access-nmkbp\") pod \"46d9cb17-c424-47e2-a9f3-d00f479be770\" (UID: \"46d9cb17-c424-47e2-a9f3-d00f479be770\") " Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.376553 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "46d9cb17-c424-47e2-a9f3-d00f479be770" (UID: "46d9cb17-c424-47e2-a9f3-d00f479be770"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.376569 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "46d9cb17-c424-47e2-a9f3-d00f479be770" (UID: "46d9cb17-c424-47e2-a9f3-d00f479be770"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.381874 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d9cb17-c424-47e2-a9f3-d00f479be770-kube-api-access-nmkbp" (OuterVolumeSpecName: "kube-api-access-nmkbp") pod "46d9cb17-c424-47e2-a9f3-d00f479be770" (UID: "46d9cb17-c424-47e2-a9f3-d00f479be770"). InnerVolumeSpecName "kube-api-access-nmkbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.384893 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-scripts" (OuterVolumeSpecName: "scripts") pod "46d9cb17-c424-47e2-a9f3-d00f479be770" (UID: "46d9cb17-c424-47e2-a9f3-d00f479be770"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.419052 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "46d9cb17-c424-47e2-a9f3-d00f479be770" (UID: "46d9cb17-c424-47e2-a9f3-d00f479be770"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.475770 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46d9cb17-c424-47e2-a9f3-d00f479be770" (UID: "46d9cb17-c424-47e2-a9f3-d00f479be770"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.479353 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.479392 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.479426 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.479438 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d9cb17-c424-47e2-a9f3-d00f479be770-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.479450 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmkbp\" (UniqueName: \"kubernetes.io/projected/46d9cb17-c424-47e2-a9f3-d00f479be770-kube-api-access-nmkbp\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.479461 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.512857 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-config-data" (OuterVolumeSpecName: "config-data") pod "46d9cb17-c424-47e2-a9f3-d00f479be770" (UID: "46d9cb17-c424-47e2-a9f3-d00f479be770"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.581503 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46d9cb17-c424-47e2-a9f3-d00f479be770-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.780718 4831 generic.go:334] "Generic (PLEG): container finished" podID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerID="290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87" exitCode=0 Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.780770 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d9cb17-c424-47e2-a9f3-d00f479be770","Type":"ContainerDied","Data":"290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87"} Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.780810 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.780819 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d9cb17-c424-47e2-a9f3-d00f479be770","Type":"ContainerDied","Data":"3b8c12573adf18db918ed0fa3b95780b2a3f6b5406fda505d663c22c5fabd99b"} Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.780829 4831 scope.go:117] "RemoveContainer" containerID="73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.801731 4831 scope.go:117] "RemoveContainer" containerID="898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.823508 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.832551 4831 scope.go:117] "RemoveContainer" containerID="fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.833122 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.853101 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.853642 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="ceilometer-notification-agent" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.853721 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="ceilometer-notification-agent" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.853775 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0cfead2-fbc2-4c82-a779-c1419a2bdd13" containerName="mariadb-account-create" Dec 04 10:35:39 crc 
kubenswrapper[4831]: I1204 10:35:39.853836 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0cfead2-fbc2-4c82-a779-c1419a2bdd13" containerName="mariadb-account-create" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.853895 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="proxy-httpd" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.853945 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="proxy-httpd" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.854011 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="ceilometer-central-agent" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.854081 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="ceilometer-central-agent" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.854150 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02d0bff-55e1-4de5-95e4-98d65018cbf0" containerName="neutron-api" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.854200 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02d0bff-55e1-4de5-95e4-98d65018cbf0" containerName="neutron-api" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.854259 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf3269a-7a62-450b-b6e3-5b18451dd26f" containerName="mariadb-account-create" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.854310 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf3269a-7a62-450b-b6e3-5b18451dd26f" containerName="mariadb-account-create" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.854368 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02d0bff-55e1-4de5-95e4-98d65018cbf0" containerName="neutron-httpd" Dec 04 
10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.854416 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02d0bff-55e1-4de5-95e4-98d65018cbf0" containerName="neutron-httpd" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.854478 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a7e487-a7d3-491f-bc96-e7e2ff378a2c" containerName="mariadb-account-create" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.854531 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a7e487-a7d3-491f-bc96-e7e2ff378a2c" containerName="mariadb-account-create" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.854591 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="sg-core" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.854639 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="sg-core" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.854888 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="ceilometer-central-agent" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.854955 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0cfead2-fbc2-4c82-a779-c1419a2bdd13" containerName="mariadb-account-create" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.855010 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="sg-core" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.855065 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="ceilometer-notification-agent" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.855122 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a02d0bff-55e1-4de5-95e4-98d65018cbf0" 
containerName="neutron-httpd" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.855182 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a7e487-a7d3-491f-bc96-e7e2ff378a2c" containerName="mariadb-account-create" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.855265 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" containerName="proxy-httpd" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.855318 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf3269a-7a62-450b-b6e3-5b18451dd26f" containerName="mariadb-account-create" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.855389 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a02d0bff-55e1-4de5-95e4-98d65018cbf0" containerName="neutron-api" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.857095 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.857986 4831 scope.go:117] "RemoveContainer" containerID="290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.860861 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.860915 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.866170 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.887570 4831 scope.go:117] "RemoveContainer" containerID="73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.899135 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0\": container with ID starting with 73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0 not found: ID does not exist" containerID="73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.899181 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0"} err="failed to get container status \"73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0\": rpc error: code = NotFound desc = could not find container \"73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0\": container with ID starting with 73951d7e874c2e9c585d5ec5160ff814c456ecb47286d3c9f983181c13c50ef0 not found: ID does not exist" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.899216 4831 scope.go:117] "RemoveContainer" containerID="898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.903164 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd\": container with ID starting with 898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd not found: ID does not exist" containerID="898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.903219 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd"} err="failed to get container status \"898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd\": rpc error: code = NotFound desc = could not find container 
\"898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd\": container with ID starting with 898643a493e42f1244a61658bf2f03b1d6c3570659ce68edd2983922982979dd not found: ID does not exist" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.903246 4831 scope.go:117] "RemoveContainer" containerID="fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.903617 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807\": container with ID starting with fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807 not found: ID does not exist" containerID="fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.903648 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807"} err="failed to get container status \"fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807\": rpc error: code = NotFound desc = could not find container \"fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807\": container with ID starting with fd71aaacbb01e9e38227d8ba792a2cc308ba6526c95152cd3bf28d95d4ec6807 not found: ID does not exist" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.903683 4831 scope.go:117] "RemoveContainer" containerID="290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87" Dec 04 10:35:39 crc kubenswrapper[4831]: E1204 10:35:39.907828 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87\": container with ID starting with 290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87 not found: ID does not exist" 
containerID="290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.907880 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87"} err="failed to get container status \"290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87\": rpc error: code = NotFound desc = could not find container \"290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87\": container with ID starting with 290ec4343d8a9e924021fb8be00b63c56030866278fdcf442da7e37e39566f87 not found: ID does not exist" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.988067 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-log-httpd\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.988144 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.988311 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn58z\" (UniqueName: \"kubernetes.io/projected/784cfe4d-8417-459a-a56d-221ee316859b-kube-api-access-mn58z\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.988539 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-scripts\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.988595 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-config-data\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.988632 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:39 crc kubenswrapper[4831]: I1204 10:35:39.988653 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-run-httpd\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.089808 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-scripts\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.089845 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-config-data\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc 
kubenswrapper[4831]: I1204 10:35:40.089868 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-run-httpd\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.089886 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.089919 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-log-httpd\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.089945 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.089992 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn58z\" (UniqueName: \"kubernetes.io/projected/784cfe4d-8417-459a-a56d-221ee316859b-kube-api-access-mn58z\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.090273 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-run-httpd\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.090493 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-log-httpd\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.095462 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-config-data\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.095912 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-scripts\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.096250 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.106063 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.106738 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mn58z\" (UniqueName: \"kubernetes.io/projected/784cfe4d-8417-459a-a56d-221ee316859b-kube-api-access-mn58z\") pod \"ceilometer-0\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.186491 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.542454 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.656133 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:40 crc kubenswrapper[4831]: W1204 10:35:40.663353 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod784cfe4d_8417_459a_a56d_221ee316859b.slice/crio-7df012686914dcf5fba83585191534dd435938f2a8972fc61555cef41c8bf74b WatchSource:0}: Error finding container 7df012686914dcf5fba83585191534dd435938f2a8972fc61555cef41c8bf74b: Status 404 returned error can't find the container with id 7df012686914dcf5fba83585191534dd435938f2a8972fc61555cef41c8bf74b Dec 04 10:35:40 crc kubenswrapper[4831]: I1204 10:35:40.808351 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"784cfe4d-8417-459a-a56d-221ee316859b","Type":"ContainerStarted","Data":"7df012686914dcf5fba83585191534dd435938f2a8972fc61555cef41c8bf74b"} Dec 04 10:35:41 crc kubenswrapper[4831]: I1204 10:35:41.288153 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d9cb17-c424-47e2-a9f3-d00f479be770" path="/var/lib/kubelet/pods/46d9cb17-c424-47e2-a9f3-d00f479be770/volumes" Dec 04 10:35:41 crc kubenswrapper[4831]: I1204 10:35:41.821420 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"784cfe4d-8417-459a-a56d-221ee316859b","Type":"ContainerStarted","Data":"52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b"} Dec 04 10:35:41 crc kubenswrapper[4831]: I1204 10:35:41.821467 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"784cfe4d-8417-459a-a56d-221ee316859b","Type":"ContainerStarted","Data":"5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9"} Dec 04 10:35:42 crc kubenswrapper[4831]: I1204 10:35:42.151460 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 10:35:42 crc kubenswrapper[4831]: I1204 10:35:42.836039 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"784cfe4d-8417-459a-a56d-221ee316859b","Type":"ContainerStarted","Data":"965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60"} Dec 04 10:35:43 crc kubenswrapper[4831]: I1204 10:35:43.846801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"784cfe4d-8417-459a-a56d-221ee316859b","Type":"ContainerStarted","Data":"119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b"} Dec 04 10:35:43 crc kubenswrapper[4831]: I1204 10:35:43.846942 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="ceilometer-central-agent" containerID="cri-o://5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9" gracePeriod=30 Dec 04 10:35:43 crc kubenswrapper[4831]: I1204 10:35:43.847024 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="sg-core" containerID="cri-o://965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60" gracePeriod=30 Dec 04 10:35:43 crc kubenswrapper[4831]: I1204 10:35:43.847055 4831 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="proxy-httpd" containerID="cri-o://119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b" gracePeriod=30 Dec 04 10:35:43 crc kubenswrapper[4831]: I1204 10:35:43.847075 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="ceilometer-notification-agent" containerID="cri-o://52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b" gracePeriod=30 Dec 04 10:35:43 crc kubenswrapper[4831]: I1204 10:35:43.849569 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:35:43 crc kubenswrapper[4831]: I1204 10:35:43.888167 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.723440845 podStartE2EDuration="4.888147653s" podCreationTimestamp="2025-12-04 10:35:39 +0000 UTC" firstStartedPulling="2025-12-04 10:35:40.670702082 +0000 UTC m=+1237.619877396" lastFinishedPulling="2025-12-04 10:35:42.83540889 +0000 UTC m=+1239.784584204" observedRunningTime="2025-12-04 10:35:43.88424033 +0000 UTC m=+1240.833415644" watchObservedRunningTime="2025-12-04 10:35:43.888147653 +0000 UTC m=+1240.837322967" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.410975 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x2xtp"] Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.414204 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.416787 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.417022 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.418079 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-z2mjw" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.428859 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x2xtp"] Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.483977 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-config-data\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.484409 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.484677 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f474p\" (UniqueName: \"kubernetes.io/projected/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-kube-api-access-f474p\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " 
pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.484875 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-scripts\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.586606 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.586737 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f474p\" (UniqueName: \"kubernetes.io/projected/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-kube-api-access-f474p\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.586820 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-scripts\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.586938 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-config-data\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " 
pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.603499 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-scripts\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.604254 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-config-data\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.604425 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.612400 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f474p\" (UniqueName: \"kubernetes.io/projected/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-kube-api-access-f474p\") pod \"nova-cell0-conductor-db-sync-x2xtp\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.749392 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.874509 4831 generic.go:334] "Generic (PLEG): container finished" podID="784cfe4d-8417-459a-a56d-221ee316859b" containerID="119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b" exitCode=0 Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.874537 4831 generic.go:334] "Generic (PLEG): container finished" podID="784cfe4d-8417-459a-a56d-221ee316859b" containerID="965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60" exitCode=2 Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.874544 4831 generic.go:334] "Generic (PLEG): container finished" podID="784cfe4d-8417-459a-a56d-221ee316859b" containerID="52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b" exitCode=0 Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.874563 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"784cfe4d-8417-459a-a56d-221ee316859b","Type":"ContainerDied","Data":"119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b"} Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.874592 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"784cfe4d-8417-459a-a56d-221ee316859b","Type":"ContainerDied","Data":"965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60"} Dec 04 10:35:44 crc kubenswrapper[4831]: I1204 10:35:44.874602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"784cfe4d-8417-459a-a56d-221ee316859b","Type":"ContainerDied","Data":"52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b"} Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.225444 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x2xtp"] Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.237978 4831 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.399967 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.400007 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.434590 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.449415 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.891872 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x2xtp" event={"ID":"83153b32-d324-4aec-a5f3-ff0c0a47c0ee","Type":"ContainerStarted","Data":"acd7de39765ac91bf370ebeed6bf311a689f61131d309eb28f28842e294f9225"} Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.892301 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.892317 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.921787 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.921836 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.955439 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Dec 04 10:35:45 crc kubenswrapper[4831]: I1204 10:35:45.968684 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:46 crc kubenswrapper[4831]: I1204 10:35:46.900798 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:46 crc kubenswrapper[4831]: I1204 10:35:46.901094 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.817731 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.852559 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-run-httpd\") pod \"784cfe4d-8417-459a-a56d-221ee316859b\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.852647 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-log-httpd\") pod \"784cfe4d-8417-459a-a56d-221ee316859b\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.852695 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn58z\" (UniqueName: \"kubernetes.io/projected/784cfe4d-8417-459a-a56d-221ee316859b-kube-api-access-mn58z\") pod \"784cfe4d-8417-459a-a56d-221ee316859b\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.852757 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-scripts\") pod \"784cfe4d-8417-459a-a56d-221ee316859b\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.852832 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-config-data\") pod \"784cfe4d-8417-459a-a56d-221ee316859b\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.852864 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-sg-core-conf-yaml\") pod \"784cfe4d-8417-459a-a56d-221ee316859b\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.852916 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-combined-ca-bundle\") pod \"784cfe4d-8417-459a-a56d-221ee316859b\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.854775 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "784cfe4d-8417-459a-a56d-221ee316859b" (UID: "784cfe4d-8417-459a-a56d-221ee316859b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.854955 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "784cfe4d-8417-459a-a56d-221ee316859b" (UID: "784cfe4d-8417-459a-a56d-221ee316859b"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.881890 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-scripts" (OuterVolumeSpecName: "scripts") pod "784cfe4d-8417-459a-a56d-221ee316859b" (UID: "784cfe4d-8417-459a-a56d-221ee316859b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.882074 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784cfe4d-8417-459a-a56d-221ee316859b-kube-api-access-mn58z" (OuterVolumeSpecName: "kube-api-access-mn58z") pod "784cfe4d-8417-459a-a56d-221ee316859b" (UID: "784cfe4d-8417-459a-a56d-221ee316859b"). InnerVolumeSpecName "kube-api-access-mn58z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.888904 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "784cfe4d-8417-459a-a56d-221ee316859b" (UID: "784cfe4d-8417-459a-a56d-221ee316859b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.917799 4831 generic.go:334] "Generic (PLEG): container finished" podID="784cfe4d-8417-459a-a56d-221ee316859b" containerID="5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9" exitCode=0 Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.918138 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"784cfe4d-8417-459a-a56d-221ee316859b","Type":"ContainerDied","Data":"5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9"} Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.918197 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"784cfe4d-8417-459a-a56d-221ee316859b","Type":"ContainerDied","Data":"7df012686914dcf5fba83585191534dd435938f2a8972fc61555cef41c8bf74b"} Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.918218 4831 scope.go:117] "RemoveContainer" containerID="119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.918250 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.953511 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "784cfe4d-8417-459a-a56d-221ee316859b" (UID: "784cfe4d-8417-459a-a56d-221ee316859b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.954053 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-combined-ca-bundle\") pod \"784cfe4d-8417-459a-a56d-221ee316859b\" (UID: \"784cfe4d-8417-459a-a56d-221ee316859b\") " Dec 04 10:35:47 crc kubenswrapper[4831]: W1204 10:35:47.954196 4831 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/784cfe4d-8417-459a-a56d-221ee316859b/volumes/kubernetes.io~secret/combined-ca-bundle Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.954233 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "784cfe4d-8417-459a-a56d-221ee316859b" (UID: "784cfe4d-8417-459a-a56d-221ee316859b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.954843 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.954874 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.954887 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/784cfe4d-8417-459a-a56d-221ee316859b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.954899 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn58z\" (UniqueName: \"kubernetes.io/projected/784cfe4d-8417-459a-a56d-221ee316859b-kube-api-access-mn58z\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.955033 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.955077 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.971608 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-config-data" (OuterVolumeSpecName: "config-data") pod "784cfe4d-8417-459a-a56d-221ee316859b" (UID: "784cfe4d-8417-459a-a56d-221ee316859b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:47 crc kubenswrapper[4831]: I1204 10:35:47.986098 4831 scope.go:117] "RemoveContainer" containerID="965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.004902 4831 scope.go:117] "RemoveContainer" containerID="52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.028448 4831 scope.go:117] "RemoveContainer" containerID="5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.057699 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/784cfe4d-8417-459a-a56d-221ee316859b-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.059249 4831 scope.go:117] "RemoveContainer" containerID="119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b" Dec 04 10:35:48 crc kubenswrapper[4831]: E1204 10:35:48.059710 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b\": container with ID starting with 119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b not found: ID does not exist" containerID="119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.059745 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b"} err="failed to get container status \"119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b\": rpc error: code = NotFound desc = could not find container \"119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b\": 
container with ID starting with 119e5804b3fe25596e8f0ee37102b5c01efcc0a83811d02c12151ac81cf5bb8b not found: ID does not exist" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.059770 4831 scope.go:117] "RemoveContainer" containerID="965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60" Dec 04 10:35:48 crc kubenswrapper[4831]: E1204 10:35:48.060380 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60\": container with ID starting with 965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60 not found: ID does not exist" containerID="965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.060422 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60"} err="failed to get container status \"965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60\": rpc error: code = NotFound desc = could not find container \"965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60\": container with ID starting with 965e9d8a2cff187748d0f71b33cf3304cfee77d894bb0134e1627a3a1db82c60 not found: ID does not exist" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.060454 4831 scope.go:117] "RemoveContainer" containerID="52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b" Dec 04 10:35:48 crc kubenswrapper[4831]: E1204 10:35:48.060918 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b\": container with ID starting with 52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b not found: ID does not exist" 
containerID="52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.060942 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b"} err="failed to get container status \"52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b\": rpc error: code = NotFound desc = could not find container \"52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b\": container with ID starting with 52c393222e3a4fc6614d5741029ef54d6d1c9c5860753fe3337a43d6f6b7193b not found: ID does not exist" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.060956 4831 scope.go:117] "RemoveContainer" containerID="5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9" Dec 04 10:35:48 crc kubenswrapper[4831]: E1204 10:35:48.061278 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9\": container with ID starting with 5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9 not found: ID does not exist" containerID="5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.061304 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9"} err="failed to get container status \"5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9\": rpc error: code = NotFound desc = could not find container \"5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9\": container with ID starting with 5f6e821c5116254899063d0cc16e8eefa012f5e542ce4dfcb1d0f6b4aebc4aa9 not found: ID does not exist" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.147479 4831 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.147585 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.150984 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.267333 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.288989 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.298227 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:48 crc kubenswrapper[4831]: E1204 10:35:48.299577 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="sg-core" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.299598 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="sg-core" Dec 04 10:35:48 crc kubenswrapper[4831]: E1204 10:35:48.299610 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="proxy-httpd" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.299620 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="proxy-httpd" Dec 04 10:35:48 crc kubenswrapper[4831]: E1204 10:35:48.299649 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="ceilometer-central-agent" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.299670 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="ceilometer-central-agent" Dec 04 10:35:48 crc kubenswrapper[4831]: E1204 10:35:48.299688 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="ceilometer-notification-agent" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.299695 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="ceilometer-notification-agent" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.299898 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="ceilometer-notification-agent" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.299913 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="ceilometer-central-agent" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.299922 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="sg-core" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.300044 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="784cfe4d-8417-459a-a56d-221ee316859b" containerName="proxy-httpd" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.301647 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.304910 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.306471 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.328031 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.379886 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-run-httpd\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.379970 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-scripts\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.379992 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-config-data\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.380021 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-log-httpd\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 
10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.380062 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.380124 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wsjj\" (UniqueName: \"kubernetes.io/projected/761b97f0-8901-4e6c-926d-02b7f1f13278-kube-api-access-4wsjj\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.380257 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.481863 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-run-httpd\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.481937 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-scripts\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.481963 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-config-data\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.481998 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-log-httpd\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.482036 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.482080 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wsjj\" (UniqueName: \"kubernetes.io/projected/761b97f0-8901-4e6c-926d-02b7f1f13278-kube-api-access-4wsjj\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.482171 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.483628 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-log-httpd\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc 
kubenswrapper[4831]: I1204 10:35:48.484074 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-run-httpd\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.489737 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.490355 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-scripts\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.491612 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.504875 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-config-data\") pod \"ceilometer-0\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.506359 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wsjj\" (UniqueName: \"kubernetes.io/projected/761b97f0-8901-4e6c-926d-02b7f1f13278-kube-api-access-4wsjj\") pod \"ceilometer-0\" (UID: 
\"761b97f0-8901-4e6c-926d-02b7f1f13278\") " pod="openstack/ceilometer-0" Dec 04 10:35:48 crc kubenswrapper[4831]: I1204 10:35:48.633032 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:35:49 crc kubenswrapper[4831]: I1204 10:35:49.207633 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:49 crc kubenswrapper[4831]: I1204 10:35:49.208013 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 10:35:49 crc kubenswrapper[4831]: I1204 10:35:49.209111 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 10:35:49 crc kubenswrapper[4831]: I1204 10:35:49.297993 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784cfe4d-8417-459a-a56d-221ee316859b" path="/var/lib/kubelet/pods/784cfe4d-8417-459a-a56d-221ee316859b/volumes" Dec 04 10:35:50 crc kubenswrapper[4831]: I1204 10:35:50.135474 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:50 crc kubenswrapper[4831]: I1204 10:35:50.950907 4831 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podf292b8ac-6250-4a8a-b73e-75c6aeebe9d5"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podf292b8ac-6250-4a8a-b73e-75c6aeebe9d5] : Timed out while waiting for systemd to remove kubepods-besteffort-podf292b8ac_6250_4a8a_b73e_75c6aeebe9d5.slice" Dec 04 10:35:50 crc kubenswrapper[4831]: E1204 10:35:50.950962 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podf292b8ac-6250-4a8a-b73e-75c6aeebe9d5] : unable to destroy cgroup paths for cgroup [kubepods besteffort podf292b8ac-6250-4a8a-b73e-75c6aeebe9d5] : Timed out while waiting for systemd to remove 
kubepods-besteffort-podf292b8ac_6250_4a8a_b73e_75c6aeebe9d5.slice" pod="openstack/nova-api-db-create-x7lqv" podUID="f292b8ac-6250-4a8a-b73e-75c6aeebe9d5" Dec 04 10:35:51 crc kubenswrapper[4831]: I1204 10:35:51.969351 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x7lqv" Dec 04 10:35:51 crc kubenswrapper[4831]: I1204 10:35:51.971515 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:35:51 crc kubenswrapper[4831]: I1204 10:35:51.971552 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:35:54 crc kubenswrapper[4831]: W1204 10:35:54.173242 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod761b97f0_8901_4e6c_926d_02b7f1f13278.slice/crio-01efad91461c4e6e52ecd86a16692ebc5fd0ffb2a599c91caa752b5cfdbed844 WatchSource:0}: Error finding container 01efad91461c4e6e52ecd86a16692ebc5fd0ffb2a599c91caa752b5cfdbed844: Status 404 returned error can't find the container with id 01efad91461c4e6e52ecd86a16692ebc5fd0ffb2a599c91caa752b5cfdbed844 Dec 04 10:35:54 crc kubenswrapper[4831]: I1204 10:35:54.176948 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:35:55 crc kubenswrapper[4831]: I1204 10:35:55.014892 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x2xtp" 
event={"ID":"83153b32-d324-4aec-a5f3-ff0c0a47c0ee","Type":"ContainerStarted","Data":"3803d71b1b1890b734afe453af5bc2ca843a43328ac506860f50d2a6674b0fbb"} Dec 04 10:35:55 crc kubenswrapper[4831]: I1204 10:35:55.019500 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"761b97f0-8901-4e6c-926d-02b7f1f13278","Type":"ContainerStarted","Data":"56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954"} Dec 04 10:35:55 crc kubenswrapper[4831]: I1204 10:35:55.019544 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"761b97f0-8901-4e6c-926d-02b7f1f13278","Type":"ContainerStarted","Data":"9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba"} Dec 04 10:35:55 crc kubenswrapper[4831]: I1204 10:35:55.019553 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"761b97f0-8901-4e6c-926d-02b7f1f13278","Type":"ContainerStarted","Data":"01efad91461c4e6e52ecd86a16692ebc5fd0ffb2a599c91caa752b5cfdbed844"} Dec 04 10:35:55 crc kubenswrapper[4831]: I1204 10:35:55.034514 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-x2xtp" podStartSLOduration=2.52740759 podStartE2EDuration="11.034497954s" podCreationTimestamp="2025-12-04 10:35:44 +0000 UTC" firstStartedPulling="2025-12-04 10:35:45.237759668 +0000 UTC m=+1242.186934982" lastFinishedPulling="2025-12-04 10:35:53.744850032 +0000 UTC m=+1250.694025346" observedRunningTime="2025-12-04 10:35:55.029412529 +0000 UTC m=+1251.978587843" watchObservedRunningTime="2025-12-04 10:35:55.034497954 +0000 UTC m=+1251.983673268" Dec 04 10:35:56 crc kubenswrapper[4831]: I1204 10:35:56.031585 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"761b97f0-8901-4e6c-926d-02b7f1f13278","Type":"ContainerStarted","Data":"4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f"} Dec 04 10:35:58 crc 
kubenswrapper[4831]: I1204 10:35:58.062483 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"761b97f0-8901-4e6c-926d-02b7f1f13278","Type":"ContainerStarted","Data":"0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774"} Dec 04 10:35:58 crc kubenswrapper[4831]: I1204 10:35:58.063365 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:35:58 crc kubenswrapper[4831]: I1204 10:35:58.062851 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="sg-core" containerID="cri-o://4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f" gracePeriod=30 Dec 04 10:35:58 crc kubenswrapper[4831]: I1204 10:35:58.062817 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="ceilometer-central-agent" containerID="cri-o://9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba" gracePeriod=30 Dec 04 10:35:58 crc kubenswrapper[4831]: I1204 10:35:58.062922 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="proxy-httpd" containerID="cri-o://0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774" gracePeriod=30 Dec 04 10:35:58 crc kubenswrapper[4831]: I1204 10:35:58.062905 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="ceilometer-notification-agent" containerID="cri-o://56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954" gracePeriod=30 Dec 04 10:35:58 crc kubenswrapper[4831]: I1204 10:35:58.108279 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=6.832494271 podStartE2EDuration="10.108258341s" podCreationTimestamp="2025-12-04 10:35:48 +0000 UTC" firstStartedPulling="2025-12-04 10:35:54.175680281 +0000 UTC m=+1251.124855605" lastFinishedPulling="2025-12-04 10:35:57.451444361 +0000 UTC m=+1254.400619675" observedRunningTime="2025-12-04 10:35:58.099698913 +0000 UTC m=+1255.048874237" watchObservedRunningTime="2025-12-04 10:35:58.108258341 +0000 UTC m=+1255.057433665" Dec 04 10:35:59 crc kubenswrapper[4831]: I1204 10:35:59.082634 4831 generic.go:334] "Generic (PLEG): container finished" podID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerID="0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774" exitCode=0 Dec 04 10:35:59 crc kubenswrapper[4831]: I1204 10:35:59.083074 4831 generic.go:334] "Generic (PLEG): container finished" podID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerID="4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f" exitCode=2 Dec 04 10:35:59 crc kubenswrapper[4831]: I1204 10:35:59.083092 4831 generic.go:334] "Generic (PLEG): container finished" podID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerID="56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954" exitCode=0 Dec 04 10:35:59 crc kubenswrapper[4831]: I1204 10:35:59.082714 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"761b97f0-8901-4e6c-926d-02b7f1f13278","Type":"ContainerDied","Data":"0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774"} Dec 04 10:35:59 crc kubenswrapper[4831]: I1204 10:35:59.083148 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"761b97f0-8901-4e6c-926d-02b7f1f13278","Type":"ContainerDied","Data":"4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f"} Dec 04 10:35:59 crc kubenswrapper[4831]: I1204 10:35:59.083177 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"761b97f0-8901-4e6c-926d-02b7f1f13278","Type":"ContainerDied","Data":"56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954"} Dec 04 10:36:00 crc kubenswrapper[4831]: I1204 10:36:00.852430 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.054231 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-log-httpd\") pod \"761b97f0-8901-4e6c-926d-02b7f1f13278\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.054313 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-scripts\") pod \"761b97f0-8901-4e6c-926d-02b7f1f13278\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.054455 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wsjj\" (UniqueName: \"kubernetes.io/projected/761b97f0-8901-4e6c-926d-02b7f1f13278-kube-api-access-4wsjj\") pod \"761b97f0-8901-4e6c-926d-02b7f1f13278\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.054563 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-run-httpd\") pod \"761b97f0-8901-4e6c-926d-02b7f1f13278\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.054622 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-config-data\") pod 
\"761b97f0-8901-4e6c-926d-02b7f1f13278\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.054726 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-combined-ca-bundle\") pod \"761b97f0-8901-4e6c-926d-02b7f1f13278\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.054775 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-sg-core-conf-yaml\") pod \"761b97f0-8901-4e6c-926d-02b7f1f13278\" (UID: \"761b97f0-8901-4e6c-926d-02b7f1f13278\") " Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.054971 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "761b97f0-8901-4e6c-926d-02b7f1f13278" (UID: "761b97f0-8901-4e6c-926d-02b7f1f13278"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.055217 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "761b97f0-8901-4e6c-926d-02b7f1f13278" (UID: "761b97f0-8901-4e6c-926d-02b7f1f13278"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.055534 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.055558 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/761b97f0-8901-4e6c-926d-02b7f1f13278-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.069863 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-scripts" (OuterVolumeSpecName: "scripts") pod "761b97f0-8901-4e6c-926d-02b7f1f13278" (UID: "761b97f0-8901-4e6c-926d-02b7f1f13278"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.071590 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761b97f0-8901-4e6c-926d-02b7f1f13278-kube-api-access-4wsjj" (OuterVolumeSpecName: "kube-api-access-4wsjj") pod "761b97f0-8901-4e6c-926d-02b7f1f13278" (UID: "761b97f0-8901-4e6c-926d-02b7f1f13278"). InnerVolumeSpecName "kube-api-access-4wsjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.112725 4831 generic.go:334] "Generic (PLEG): container finished" podID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerID="9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba" exitCode=0 Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.112789 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"761b97f0-8901-4e6c-926d-02b7f1f13278","Type":"ContainerDied","Data":"9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba"} Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.112826 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"761b97f0-8901-4e6c-926d-02b7f1f13278","Type":"ContainerDied","Data":"01efad91461c4e6e52ecd86a16692ebc5fd0ffb2a599c91caa752b5cfdbed844"} Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.112864 4831 scope.go:117] "RemoveContainer" containerID="0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.113109 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.128235 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "761b97f0-8901-4e6c-926d-02b7f1f13278" (UID: "761b97f0-8901-4e6c-926d-02b7f1f13278"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.157306 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wsjj\" (UniqueName: \"kubernetes.io/projected/761b97f0-8901-4e6c-926d-02b7f1f13278-kube-api-access-4wsjj\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.157336 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.157346 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.162448 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "761b97f0-8901-4e6c-926d-02b7f1f13278" (UID: "761b97f0-8901-4e6c-926d-02b7f1f13278"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.176864 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-config-data" (OuterVolumeSpecName: "config-data") pod "761b97f0-8901-4e6c-926d-02b7f1f13278" (UID: "761b97f0-8901-4e6c-926d-02b7f1f13278"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.185857 4831 scope.go:117] "RemoveContainer" containerID="4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.208221 4831 scope.go:117] "RemoveContainer" containerID="56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.227909 4831 scope.go:117] "RemoveContainer" containerID="9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.258743 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.258788 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761b97f0-8901-4e6c-926d-02b7f1f13278-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.258892 4831 scope.go:117] "RemoveContainer" containerID="0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774" Dec 04 10:36:01 crc kubenswrapper[4831]: E1204 10:36:01.260008 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774\": container with ID starting with 0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774 not found: ID does not exist" containerID="0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.260062 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774"} 
err="failed to get container status \"0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774\": rpc error: code = NotFound desc = could not find container \"0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774\": container with ID starting with 0cb6a92dec3f7029000d487ef34fc4b61dc05dad7f42975a29bae631d8fba774 not found: ID does not exist" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.260095 4831 scope.go:117] "RemoveContainer" containerID="4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f" Dec 04 10:36:01 crc kubenswrapper[4831]: E1204 10:36:01.260350 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f\": container with ID starting with 4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f not found: ID does not exist" containerID="4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.260391 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f"} err="failed to get container status \"4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f\": rpc error: code = NotFound desc = could not find container \"4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f\": container with ID starting with 4788e27e60b79fb1ac9e98ef6912e6188aa593cfc97585c480b97f285c78e48f not found: ID does not exist" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.260416 4831 scope.go:117] "RemoveContainer" containerID="56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954" Dec 04 10:36:01 crc kubenswrapper[4831]: E1204 10:36:01.260707 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954\": container with ID starting with 56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954 not found: ID does not exist" containerID="56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.260745 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954"} err="failed to get container status \"56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954\": rpc error: code = NotFound desc = could not find container \"56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954\": container with ID starting with 56fa413f3ac83ec79bd32365e08b33c040c45f647ce56d900b1414003693e954 not found: ID does not exist" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.260767 4831 scope.go:117] "RemoveContainer" containerID="9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba" Dec 04 10:36:01 crc kubenswrapper[4831]: E1204 10:36:01.261005 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba\": container with ID starting with 9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba not found: ID does not exist" containerID="9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.261038 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba"} err="failed to get container status \"9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba\": rpc error: code = NotFound desc = could not find container \"9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba\": container with ID 
starting with 9281a9d4a3eda136b670928e0ae37b2e793b05c26a83f4287b5eca02998421ba not found: ID does not exist" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.456404 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.467785 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.499756 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:01 crc kubenswrapper[4831]: E1204 10:36:01.500314 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="proxy-httpd" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.500338 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="proxy-httpd" Dec 04 10:36:01 crc kubenswrapper[4831]: E1204 10:36:01.500375 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="ceilometer-notification-agent" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.500400 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="ceilometer-notification-agent" Dec 04 10:36:01 crc kubenswrapper[4831]: E1204 10:36:01.500415 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="ceilometer-central-agent" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.500423 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="ceilometer-central-agent" Dec 04 10:36:01 crc kubenswrapper[4831]: E1204 10:36:01.500447 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="sg-core" Dec 04 10:36:01 crc 
kubenswrapper[4831]: I1204 10:36:01.500456 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="sg-core" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.500714 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="sg-core" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.500740 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="proxy-httpd" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.500758 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="ceilometer-central-agent" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.500783 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" containerName="ceilometer-notification-agent" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.503055 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.508764 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.508866 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.515197 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.566760 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.567383 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-config-data\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.567454 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-run-httpd\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.567519 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzql5\" (UniqueName: \"kubernetes.io/projected/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-kube-api-access-nzql5\") pod \"ceilometer-0\" (UID: 
\"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.567639 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-log-httpd\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.567733 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-scripts\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.567782 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.669443 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-config-data\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.669520 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-run-httpd\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.669562 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nzql5\" (UniqueName: \"kubernetes.io/projected/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-kube-api-access-nzql5\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.669614 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-log-httpd\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.669708 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-scripts\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.669819 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.669876 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.670704 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-run-httpd\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 
04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.670755 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-log-httpd\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.673653 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-config-data\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.674077 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-scripts\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.675594 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.678056 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.697551 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzql5\" (UniqueName: \"kubernetes.io/projected/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-kube-api-access-nzql5\") pod \"ceilometer-0\" 
(UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " pod="openstack/ceilometer-0" Dec 04 10:36:01 crc kubenswrapper[4831]: I1204 10:36:01.852158 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:02 crc kubenswrapper[4831]: W1204 10:36:02.332598 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode416fb67_8343_42f4_9214_9b1d0cc9a2bd.slice/crio-31aba505a8c80b6bd429bb8e194332c8c81716e3f24b2bf8f270e15f6a484106 WatchSource:0}: Error finding container 31aba505a8c80b6bd429bb8e194332c8c81716e3f24b2bf8f270e15f6a484106: Status 404 returned error can't find the container with id 31aba505a8c80b6bd429bb8e194332c8c81716e3f24b2bf8f270e15f6a484106 Dec 04 10:36:02 crc kubenswrapper[4831]: I1204 10:36:02.336437 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:03 crc kubenswrapper[4831]: I1204 10:36:03.133039 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e416fb67-8343-42f4-9214-9b1d0cc9a2bd","Type":"ContainerStarted","Data":"2c632582e5a5645e980b6e363572e3d784b1ef503e10c30320b5b76771567edf"} Dec 04 10:36:03 crc kubenswrapper[4831]: I1204 10:36:03.133360 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e416fb67-8343-42f4-9214-9b1d0cc9a2bd","Type":"ContainerStarted","Data":"0f9f9089ad8c95355b59e8c12e5ba27fef74afe0c975d0f18bd7c727b7db8997"} Dec 04 10:36:03 crc kubenswrapper[4831]: I1204 10:36:03.133374 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e416fb67-8343-42f4-9214-9b1d0cc9a2bd","Type":"ContainerStarted","Data":"31aba505a8c80b6bd429bb8e194332c8c81716e3f24b2bf8f270e15f6a484106"} Dec 04 10:36:03 crc kubenswrapper[4831]: I1204 10:36:03.192987 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 
04 10:36:03 crc kubenswrapper[4831]: I1204 10:36:03.193263 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" containerID="cri-o://362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7" gracePeriod=30 Dec 04 10:36:03 crc kubenswrapper[4831]: I1204 10:36:03.238789 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:03 crc kubenswrapper[4831]: I1204 10:36:03.311552 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761b97f0-8901-4e6c-926d-02b7f1f13278" path="/var/lib/kubelet/pods/761b97f0-8901-4e6c-926d-02b7f1f13278/volumes" Dec 04 10:36:04 crc kubenswrapper[4831]: I1204 10:36:04.146585 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e416fb67-8343-42f4-9214-9b1d0cc9a2bd","Type":"ContainerStarted","Data":"585716c3b8363f3cf20cb589e33d7dc7c39205ec5de1abec06d28fc63a171430"} Dec 04 10:36:05 crc kubenswrapper[4831]: I1204 10:36:05.157279 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e416fb67-8343-42f4-9214-9b1d0cc9a2bd","Type":"ContainerStarted","Data":"fd991c7345d36b8c22484dd88b156b14878dfab9ed908f14f8f2e6ff1eb67fdc"} Dec 04 10:36:05 crc kubenswrapper[4831]: I1204 10:36:05.157868 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="ceilometer-central-agent" containerID="cri-o://0f9f9089ad8c95355b59e8c12e5ba27fef74afe0c975d0f18bd7c727b7db8997" gracePeriod=30 Dec 04 10:36:05 crc kubenswrapper[4831]: I1204 10:36:05.158160 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:36:05 crc kubenswrapper[4831]: I1204 10:36:05.159647 4831 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/ceilometer-0" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="proxy-httpd" containerID="cri-o://fd991c7345d36b8c22484dd88b156b14878dfab9ed908f14f8f2e6ff1eb67fdc" gracePeriod=30 Dec 04 10:36:05 crc kubenswrapper[4831]: I1204 10:36:05.159749 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="sg-core" containerID="cri-o://585716c3b8363f3cf20cb589e33d7dc7c39205ec5de1abec06d28fc63a171430" gracePeriod=30 Dec 04 10:36:05 crc kubenswrapper[4831]: I1204 10:36:05.159789 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="ceilometer-notification-agent" containerID="cri-o://2c632582e5a5645e980b6e363572e3d784b1ef503e10c30320b5b76771567edf" gracePeriod=30 Dec 04 10:36:05 crc kubenswrapper[4831]: I1204 10:36:05.193491 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.936016329 podStartE2EDuration="4.19347142s" podCreationTimestamp="2025-12-04 10:36:01 +0000 UTC" firstStartedPulling="2025-12-04 10:36:02.334364233 +0000 UTC m=+1259.283539547" lastFinishedPulling="2025-12-04 10:36:04.591819324 +0000 UTC m=+1261.540994638" observedRunningTime="2025-12-04 10:36:05.18668569 +0000 UTC m=+1262.135861014" watchObservedRunningTime="2025-12-04 10:36:05.19347142 +0000 UTC m=+1262.142646724" Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.173424 4831 generic.go:334] "Generic (PLEG): container finished" podID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerID="fd991c7345d36b8c22484dd88b156b14878dfab9ed908f14f8f2e6ff1eb67fdc" exitCode=0 Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.173911 4831 generic.go:334] "Generic (PLEG): container finished" podID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" 
containerID="585716c3b8363f3cf20cb589e33d7dc7c39205ec5de1abec06d28fc63a171430" exitCode=2 Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.173967 4831 generic.go:334] "Generic (PLEG): container finished" podID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerID="2c632582e5a5645e980b6e363572e3d784b1ef503e10c30320b5b76771567edf" exitCode=0 Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.173590 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e416fb67-8343-42f4-9214-9b1d0cc9a2bd","Type":"ContainerDied","Data":"fd991c7345d36b8c22484dd88b156b14878dfab9ed908f14f8f2e6ff1eb67fdc"} Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.174028 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e416fb67-8343-42f4-9214-9b1d0cc9a2bd","Type":"ContainerDied","Data":"585716c3b8363f3cf20cb589e33d7dc7c39205ec5de1abec06d28fc63a171430"} Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.174051 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e416fb67-8343-42f4-9214-9b1d0cc9a2bd","Type":"ContainerDied","Data":"2c632582e5a5645e980b6e363572e3d784b1ef503e10c30320b5b76771567edf"} Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.784035 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.889089 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-custom-prometheus-ca\") pod \"6040f79c-a151-4681-a758-d2741bff68b6\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.889453 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-config-data\") pod \"6040f79c-a151-4681-a758-d2741bff68b6\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.889505 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-combined-ca-bundle\") pod \"6040f79c-a151-4681-a758-d2741bff68b6\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.889652 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h89xx\" (UniqueName: \"kubernetes.io/projected/6040f79c-a151-4681-a758-d2741bff68b6-kube-api-access-h89xx\") pod \"6040f79c-a151-4681-a758-d2741bff68b6\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.889750 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6040f79c-a151-4681-a758-d2741bff68b6-logs\") pod \"6040f79c-a151-4681-a758-d2741bff68b6\" (UID: \"6040f79c-a151-4681-a758-d2741bff68b6\") " Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.890625 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6040f79c-a151-4681-a758-d2741bff68b6-logs" (OuterVolumeSpecName: "logs") pod "6040f79c-a151-4681-a758-d2741bff68b6" (UID: "6040f79c-a151-4681-a758-d2741bff68b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.899950 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6040f79c-a151-4681-a758-d2741bff68b6-kube-api-access-h89xx" (OuterVolumeSpecName: "kube-api-access-h89xx") pod "6040f79c-a151-4681-a758-d2741bff68b6" (UID: "6040f79c-a151-4681-a758-d2741bff68b6"). InnerVolumeSpecName "kube-api-access-h89xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.923414 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6040f79c-a151-4681-a758-d2741bff68b6" (UID: "6040f79c-a151-4681-a758-d2741bff68b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.932714 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "6040f79c-a151-4681-a758-d2741bff68b6" (UID: "6040f79c-a151-4681-a758-d2741bff68b6"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.961525 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-config-data" (OuterVolumeSpecName: "config-data") pod "6040f79c-a151-4681-a758-d2741bff68b6" (UID: "6040f79c-a151-4681-a758-d2741bff68b6"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.991807 4831 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.991844 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.991853 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6040f79c-a151-4681-a758-d2741bff68b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.991862 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h89xx\" (UniqueName: \"kubernetes.io/projected/6040f79c-a151-4681-a758-d2741bff68b6-kube-api-access-h89xx\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:06 crc kubenswrapper[4831]: I1204 10:36:06.991871 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6040f79c-a151-4681-a758-d2741bff68b6-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.184415 4831 generic.go:334] "Generic (PLEG): container finished" podID="6040f79c-a151-4681-a758-d2741bff68b6" containerID="362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7" exitCode=0 Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.184462 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6040f79c-a151-4681-a758-d2741bff68b6","Type":"ContainerDied","Data":"362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7"} Dec 04 
10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.184502 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6040f79c-a151-4681-a758-d2741bff68b6","Type":"ContainerDied","Data":"3ced7baec4e0db4fc5d0edc8669d31b19513384a9477f798f028a7a0cb3f7a39"} Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.184523 4831 scope.go:117] "RemoveContainer" containerID="362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.184654 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.235713 4831 scope.go:117] "RemoveContainer" containerID="bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.253066 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.265513 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.318156 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6040f79c-a151-4681-a758-d2741bff68b6" path="/var/lib/kubelet/pods/6040f79c-a151-4681-a758-d2741bff68b6/volumes" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.319590 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 04 10:36:07 crc kubenswrapper[4831]: E1204 10:36:07.320301 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.320323 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 
10:36:07 crc kubenswrapper[4831]: E1204 10:36:07.320385 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.320394 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 10:36:07 crc kubenswrapper[4831]: E1204 10:36:07.320406 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.320414 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 10:36:07 crc kubenswrapper[4831]: E1204 10:36:07.320457 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.320467 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.321981 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.322018 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.322051 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.322908 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/watcher-decision-engine-0"] Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.323046 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.327521 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.332239 4831 scope.go:117] "RemoveContainer" containerID="362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7" Dec 04 10:36:07 crc kubenswrapper[4831]: E1204 10:36:07.332761 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7\": container with ID starting with 362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7 not found: ID does not exist" containerID="362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.332828 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7"} err="failed to get container status \"362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7\": rpc error: code = NotFound desc = could not find container \"362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7\": container with ID starting with 362f1baaa691428bb0b53e1a5646ee86e99621f28c605cae4217d58e7e827ff7 not found: ID does not exist" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.332855 4831 scope.go:117] "RemoveContainer" containerID="bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3" Dec 04 10:36:07 crc kubenswrapper[4831]: E1204 10:36:07.333165 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3\": container with ID starting with bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3 not found: ID does not exist" containerID="bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.333202 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3"} err="failed to get container status \"bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3\": rpc error: code = NotFound desc = could not find container \"bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3\": container with ID starting with bac90f7915063288de67a2e139c40837b5708a4e13373e20cb3ac91496de72b3 not found: ID does not exist" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.411741 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edaf6c9-d794-4d0c-ab0c-35d46001545a-logs\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.412062 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0edaf6c9-d794-4d0c-ab0c-35d46001545a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.412233 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaf6c9-d794-4d0c-ab0c-35d46001545a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: 
\"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.412471 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edaf6c9-d794-4d0c-ab0c-35d46001545a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.412618 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcclb\" (UniqueName: \"kubernetes.io/projected/0edaf6c9-d794-4d0c-ab0c-35d46001545a-kube-api-access-pcclb\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.514696 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0edaf6c9-d794-4d0c-ab0c-35d46001545a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.514984 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaf6c9-d794-4d0c-ab0c-35d46001545a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.515177 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edaf6c9-d794-4d0c-ab0c-35d46001545a-config-data\") pod \"watcher-decision-engine-0\" (UID: 
\"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.515272 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcclb\" (UniqueName: \"kubernetes.io/projected/0edaf6c9-d794-4d0c-ab0c-35d46001545a-kube-api-access-pcclb\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.515414 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edaf6c9-d794-4d0c-ab0c-35d46001545a-logs\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.515869 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edaf6c9-d794-4d0c-ab0c-35d46001545a-logs\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.519953 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0edaf6c9-d794-4d0c-ab0c-35d46001545a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.520322 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edaf6c9-d794-4d0c-ab0c-35d46001545a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 
10:36:07.520344 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaf6c9-d794-4d0c-ab0c-35d46001545a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.533881 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcclb\" (UniqueName: \"kubernetes.io/projected/0edaf6c9-d794-4d0c-ab0c-35d46001545a-kube-api-access-pcclb\") pod \"watcher-decision-engine-0\" (UID: \"0edaf6c9-d794-4d0c-ab0c-35d46001545a\") " pod="openstack/watcher-decision-engine-0" Dec 04 10:36:07 crc kubenswrapper[4831]: I1204 10:36:07.651283 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 04 10:36:08 crc kubenswrapper[4831]: I1204 10:36:08.084701 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 04 10:36:08 crc kubenswrapper[4831]: I1204 10:36:08.195393 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"0edaf6c9-d794-4d0c-ab0c-35d46001545a","Type":"ContainerStarted","Data":"8c28e523cd5a071ef9e4ae5531649edb8e37d490cd40221d353e258260d7b736"} Dec 04 10:36:09 crc kubenswrapper[4831]: I1204 10:36:09.208589 4831 generic.go:334] "Generic (PLEG): container finished" podID="83153b32-d324-4aec-a5f3-ff0c0a47c0ee" containerID="3803d71b1b1890b734afe453af5bc2ca843a43328ac506860f50d2a6674b0fbb" exitCode=0 Dec 04 10:36:09 crc kubenswrapper[4831]: I1204 10:36:09.208709 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x2xtp" event={"ID":"83153b32-d324-4aec-a5f3-ff0c0a47c0ee","Type":"ContainerDied","Data":"3803d71b1b1890b734afe453af5bc2ca843a43328ac506860f50d2a6674b0fbb"} Dec 04 10:36:09 crc kubenswrapper[4831]: I1204 
10:36:09.211079 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"0edaf6c9-d794-4d0c-ab0c-35d46001545a","Type":"ContainerStarted","Data":"5cb2b11f710ea9fe10b29c5d398ee9932e11bcdf990b23afd92713f06734d550"} Dec 04 10:36:09 crc kubenswrapper[4831]: I1204 10:36:09.238245 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.238226365 podStartE2EDuration="2.238226365s" podCreationTimestamp="2025-12-04 10:36:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:36:09.238094152 +0000 UTC m=+1266.187269466" watchObservedRunningTime="2025-12-04 10:36:09.238226365 +0000 UTC m=+1266.187401689" Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.556385 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.680065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f474p\" (UniqueName: \"kubernetes.io/projected/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-kube-api-access-f474p\") pod \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.680140 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-config-data\") pod \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.680202 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-scripts\") pod 
\"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.680253 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-combined-ca-bundle\") pod \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\" (UID: \"83153b32-d324-4aec-a5f3-ff0c0a47c0ee\") " Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.685539 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-scripts" (OuterVolumeSpecName: "scripts") pod "83153b32-d324-4aec-a5f3-ff0c0a47c0ee" (UID: "83153b32-d324-4aec-a5f3-ff0c0a47c0ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.686828 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-kube-api-access-f474p" (OuterVolumeSpecName: "kube-api-access-f474p") pod "83153b32-d324-4aec-a5f3-ff0c0a47c0ee" (UID: "83153b32-d324-4aec-a5f3-ff0c0a47c0ee"). InnerVolumeSpecName "kube-api-access-f474p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.713969 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-config-data" (OuterVolumeSpecName: "config-data") pod "83153b32-d324-4aec-a5f3-ff0c0a47c0ee" (UID: "83153b32-d324-4aec-a5f3-ff0c0a47c0ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.716998 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83153b32-d324-4aec-a5f3-ff0c0a47c0ee" (UID: "83153b32-d324-4aec-a5f3-ff0c0a47c0ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.782495 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f474p\" (UniqueName: \"kubernetes.io/projected/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-kube-api-access-f474p\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.782528 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.782552 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:10 crc kubenswrapper[4831]: I1204 10:36:10.782562 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83153b32-d324-4aec-a5f3-ff0c0a47c0ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.254589 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x2xtp" event={"ID":"83153b32-d324-4aec-a5f3-ff0c0a47c0ee","Type":"ContainerDied","Data":"acd7de39765ac91bf370ebeed6bf311a689f61131d309eb28f28842e294f9225"} Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.256309 4831 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="acd7de39765ac91bf370ebeed6bf311a689f61131d309eb28f28842e294f9225" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.256338 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x2xtp" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.320138 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 10:36:11 crc kubenswrapper[4831]: E1204 10:36:11.320539 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83153b32-d324-4aec-a5f3-ff0c0a47c0ee" containerName="nova-cell0-conductor-db-sync" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.320554 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="83153b32-d324-4aec-a5f3-ff0c0a47c0ee" containerName="nova-cell0-conductor-db-sync" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.320733 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6040f79c-a151-4681-a758-d2741bff68b6" containerName="watcher-decision-engine" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.320774 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="83153b32-d324-4aec-a5f3-ff0c0a47c0ee" containerName="nova-cell0-conductor-db-sync" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.321469 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.334129 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-z2mjw" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.334311 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.334639 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.397465 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlx5w\" (UniqueName: \"kubernetes.io/projected/8deabc31-a8c8-43f8-b854-d25d8e13f9ed-kube-api-access-hlx5w\") pod \"nova-cell0-conductor-0\" (UID: \"8deabc31-a8c8-43f8-b854-d25d8e13f9ed\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.397847 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8deabc31-a8c8-43f8-b854-d25d8e13f9ed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8deabc31-a8c8-43f8-b854-d25d8e13f9ed\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.398159 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8deabc31-a8c8-43f8-b854-d25d8e13f9ed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8deabc31-a8c8-43f8-b854-d25d8e13f9ed\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.499839 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8deabc31-a8c8-43f8-b854-d25d8e13f9ed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8deabc31-a8c8-43f8-b854-d25d8e13f9ed\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.499915 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8deabc31-a8c8-43f8-b854-d25d8e13f9ed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8deabc31-a8c8-43f8-b854-d25d8e13f9ed\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.500079 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlx5w\" (UniqueName: \"kubernetes.io/projected/8deabc31-a8c8-43f8-b854-d25d8e13f9ed-kube-api-access-hlx5w\") pod \"nova-cell0-conductor-0\" (UID: \"8deabc31-a8c8-43f8-b854-d25d8e13f9ed\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.505414 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8deabc31-a8c8-43f8-b854-d25d8e13f9ed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8deabc31-a8c8-43f8-b854-d25d8e13f9ed\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.514412 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8deabc31-a8c8-43f8-b854-d25d8e13f9ed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8deabc31-a8c8-43f8-b854-d25d8e13f9ed\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.518544 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlx5w\" (UniqueName: \"kubernetes.io/projected/8deabc31-a8c8-43f8-b854-d25d8e13f9ed-kube-api-access-hlx5w\") pod \"nova-cell0-conductor-0\" 
(UID: \"8deabc31-a8c8-43f8-b854-d25d8e13f9ed\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:11 crc kubenswrapper[4831]: I1204 10:36:11.644541 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.120983 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 10:36:12 crc kubenswrapper[4831]: W1204 10:36:12.128539 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8deabc31_a8c8_43f8_b854_d25d8e13f9ed.slice/crio-75b178ab494b62a5ee019b0878e3789a23ae5651f83245d656a679b77019e744 WatchSource:0}: Error finding container 75b178ab494b62a5ee019b0878e3789a23ae5651f83245d656a679b77019e744: Status 404 returned error can't find the container with id 75b178ab494b62a5ee019b0878e3789a23ae5651f83245d656a679b77019e744 Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.273345 4831 generic.go:334] "Generic (PLEG): container finished" podID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerID="0f9f9089ad8c95355b59e8c12e5ba27fef74afe0c975d0f18bd7c727b7db8997" exitCode=0 Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.273409 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e416fb67-8343-42f4-9214-9b1d0cc9a2bd","Type":"ContainerDied","Data":"0f9f9089ad8c95355b59e8c12e5ba27fef74afe0c975d0f18bd7c727b7db8997"} Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.275402 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8deabc31-a8c8-43f8-b854-d25d8e13f9ed","Type":"ContainerStarted","Data":"75b178ab494b62a5ee019b0878e3789a23ae5651f83245d656a679b77019e744"} Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.301897 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.416108 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-log-httpd\") pod \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.416199 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-config-data\") pod \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.416288 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzql5\" (UniqueName: \"kubernetes.io/projected/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-kube-api-access-nzql5\") pod \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.416349 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-combined-ca-bundle\") pod \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.416372 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-sg-core-conf-yaml\") pod \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.416426 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-scripts\") pod \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.416480 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-run-httpd\") pod \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\" (UID: \"e416fb67-8343-42f4-9214-9b1d0cc9a2bd\") " Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.416795 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e416fb67-8343-42f4-9214-9b1d0cc9a2bd" (UID: "e416fb67-8343-42f4-9214-9b1d0cc9a2bd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.416982 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.417472 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e416fb67-8343-42f4-9214-9b1d0cc9a2bd" (UID: "e416fb67-8343-42f4-9214-9b1d0cc9a2bd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.420820 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-scripts" (OuterVolumeSpecName: "scripts") pod "e416fb67-8343-42f4-9214-9b1d0cc9a2bd" (UID: "e416fb67-8343-42f4-9214-9b1d0cc9a2bd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.420979 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-kube-api-access-nzql5" (OuterVolumeSpecName: "kube-api-access-nzql5") pod "e416fb67-8343-42f4-9214-9b1d0cc9a2bd" (UID: "e416fb67-8343-42f4-9214-9b1d0cc9a2bd"). InnerVolumeSpecName "kube-api-access-nzql5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.445798 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e416fb67-8343-42f4-9214-9b1d0cc9a2bd" (UID: "e416fb67-8343-42f4-9214-9b1d0cc9a2bd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.515203 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e416fb67-8343-42f4-9214-9b1d0cc9a2bd" (UID: "e416fb67-8343-42f4-9214-9b1d0cc9a2bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.516206 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-config-data" (OuterVolumeSpecName: "config-data") pod "e416fb67-8343-42f4-9214-9b1d0cc9a2bd" (UID: "e416fb67-8343-42f4-9214-9b1d0cc9a2bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.518957 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.518992 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.519005 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.519018 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.519028 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:12 crc kubenswrapper[4831]: I1204 10:36:12.519043 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzql5\" (UniqueName: \"kubernetes.io/projected/e416fb67-8343-42f4-9214-9b1d0cc9a2bd-kube-api-access-nzql5\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.300801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e416fb67-8343-42f4-9214-9b1d0cc9a2bd","Type":"ContainerDied","Data":"31aba505a8c80b6bd429bb8e194332c8c81716e3f24b2bf8f270e15f6a484106"} Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 
10:36:13.303077 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.304954 4831 scope.go:117] "RemoveContainer" containerID="fd991c7345d36b8c22484dd88b156b14878dfab9ed908f14f8f2e6ff1eb67fdc" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.305393 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8deabc31-a8c8-43f8-b854-d25d8e13f9ed","Type":"ContainerStarted","Data":"d2c546ebb598460da919620fcbfdfa9d9aab3c80d1f7ce19a032c3a6c38f808e"} Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.305845 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.333102 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.333082733 podStartE2EDuration="2.333082733s" podCreationTimestamp="2025-12-04 10:36:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:36:13.32546715 +0000 UTC m=+1270.274642484" watchObservedRunningTime="2025-12-04 10:36:13.333082733 +0000 UTC m=+1270.282258047" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.354929 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.379751 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.391605 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:13 crc kubenswrapper[4831]: E1204 10:36:13.392061 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" 
containerName="ceilometer-notification-agent" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.392081 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="ceilometer-notification-agent" Dec 04 10:36:13 crc kubenswrapper[4831]: E1204 10:36:13.392100 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="sg-core" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.392108 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="sg-core" Dec 04 10:36:13 crc kubenswrapper[4831]: E1204 10:36:13.392123 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="ceilometer-central-agent" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.392130 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="ceilometer-central-agent" Dec 04 10:36:13 crc kubenswrapper[4831]: E1204 10:36:13.392149 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="proxy-httpd" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.392156 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="proxy-httpd" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.392362 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="proxy-httpd" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.392384 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="ceilometer-central-agent" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.392393 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="ceilometer-notification-agent" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.392410 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" containerName="sg-core" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.394313 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.397834 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.398296 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.408168 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.539807 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-config-data\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.539873 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-log-httpd\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.539909 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-run-httpd\") pod \"ceilometer-0\" (UID: 
\"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.540102 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d7lq\" (UniqueName: \"kubernetes.io/projected/497edb30-816e-45e2-9180-cfc1392f2c1c-kube-api-access-7d7lq\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.540203 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.540331 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-scripts\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.540523 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.642317 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-config-data\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.642388 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-log-httpd\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.642438 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-run-httpd\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.642488 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d7lq\" (UniqueName: \"kubernetes.io/projected/497edb30-816e-45e2-9180-cfc1392f2c1c-kube-api-access-7d7lq\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.642526 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.642564 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-scripts\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.642619 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.643853 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-run-httpd\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.643841 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-log-httpd\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.644559 4831 scope.go:117] "RemoveContainer" containerID="585716c3b8363f3cf20cb589e33d7dc7c39205ec5de1abec06d28fc63a171430" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.648611 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.650473 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-config-data\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.651368 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc 
kubenswrapper[4831]: I1204 10:36:13.655366 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-scripts\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.666465 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d7lq\" (UniqueName: \"kubernetes.io/projected/497edb30-816e-45e2-9180-cfc1392f2c1c-kube-api-access-7d7lq\") pod \"ceilometer-0\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.724945 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.857742 4831 scope.go:117] "RemoveContainer" containerID="2c632582e5a5645e980b6e363572e3d784b1ef503e10c30320b5b76771567edf" Dec 04 10:36:13 crc kubenswrapper[4831]: I1204 10:36:13.888500 4831 scope.go:117] "RemoveContainer" containerID="0f9f9089ad8c95355b59e8c12e5ba27fef74afe0c975d0f18bd7c727b7db8997" Dec 04 10:36:14 crc kubenswrapper[4831]: I1204 10:36:14.239733 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:14 crc kubenswrapper[4831]: I1204 10:36:14.315181 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497edb30-816e-45e2-9180-cfc1392f2c1c","Type":"ContainerStarted","Data":"177a4cc7b6b7877b0a90f7fc54fdd3802b74a893bf113981354dff7ca74c04e1"} Dec 04 10:36:15 crc kubenswrapper[4831]: I1204 10:36:15.306327 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e416fb67-8343-42f4-9214-9b1d0cc9a2bd" path="/var/lib/kubelet/pods/e416fb67-8343-42f4-9214-9b1d0cc9a2bd/volumes" Dec 04 10:36:16 crc kubenswrapper[4831]: I1204 10:36:16.344359 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497edb30-816e-45e2-9180-cfc1392f2c1c","Type":"ContainerStarted","Data":"e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667"} Dec 04 10:36:16 crc kubenswrapper[4831]: I1204 10:36:16.344843 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497edb30-816e-45e2-9180-cfc1392f2c1c","Type":"ContainerStarted","Data":"1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd"} Dec 04 10:36:17 crc kubenswrapper[4831]: I1204 10:36:17.358981 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497edb30-816e-45e2-9180-cfc1392f2c1c","Type":"ContainerStarted","Data":"ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f"} Dec 04 10:36:17 crc kubenswrapper[4831]: I1204 10:36:17.651496 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 10:36:17 crc kubenswrapper[4831]: I1204 10:36:17.680477 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 04 10:36:18 crc kubenswrapper[4831]: I1204 10:36:18.366831 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 04 10:36:18 crc kubenswrapper[4831]: I1204 10:36:18.405512 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 04 10:36:19 crc kubenswrapper[4831]: I1204 10:36:19.378890 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497edb30-816e-45e2-9180-cfc1392f2c1c","Type":"ContainerStarted","Data":"2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0"} Dec 04 10:36:19 crc kubenswrapper[4831]: I1204 10:36:19.379263 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:36:19 crc 
kubenswrapper[4831]: I1204 10:36:19.399938 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.332815951 podStartE2EDuration="6.399922922s" podCreationTimestamp="2025-12-04 10:36:13 +0000 UTC" firstStartedPulling="2025-12-04 10:36:14.241501304 +0000 UTC m=+1271.190676618" lastFinishedPulling="2025-12-04 10:36:18.308608275 +0000 UTC m=+1275.257783589" observedRunningTime="2025-12-04 10:36:19.397138468 +0000 UTC m=+1276.346313782" watchObservedRunningTime="2025-12-04 10:36:19.399922922 +0000 UTC m=+1276.349098226" Dec 04 10:36:21 crc kubenswrapper[4831]: I1204 10:36:21.675562 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 04 10:36:21 crc kubenswrapper[4831]: I1204 10:36:21.971247 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:36:21 crc kubenswrapper[4831]: I1204 10:36:21.971292 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:36:21 crc kubenswrapper[4831]: I1204 10:36:21.971328 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:36:21 crc kubenswrapper[4831]: I1204 10:36:21.971982 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0123b3bc298be4c2ac62175c379dc6efb186183e599f1998133a95b106c98408"} 
pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:36:21 crc kubenswrapper[4831]: I1204 10:36:21.972029 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://0123b3bc298be4c2ac62175c379dc6efb186183e599f1998133a95b106c98408" gracePeriod=600 Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.129353 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mmb2j"] Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.131022 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.135615 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.135860 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.144588 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mmb2j"] Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.223225 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-config-data\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.223279 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-scripts\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.223419 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kdtc\" (UniqueName: \"kubernetes.io/projected/2dbc983b-0bfd-4646-9a07-4f0894f1c480-kube-api-access-2kdtc\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.223454 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.328165 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-config-data\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.328199 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-scripts\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.328293 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kdtc\" (UniqueName: 
\"kubernetes.io/projected/2dbc983b-0bfd-4646-9a07-4f0894f1c480-kube-api-access-2kdtc\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.328316 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.336810 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.346208 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.348309 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.352295 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-scripts\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.354807 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-config-data\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.364271 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.364627 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.422350 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kdtc\" (UniqueName: \"kubernetes.io/projected/2dbc983b-0bfd-4646-9a07-4f0894f1c480-kube-api-access-2kdtc\") pod \"nova-cell0-cell-mapping-mmb2j\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.432112 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.432201 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpn46\" (UniqueName: \"kubernetes.io/projected/0bc76d60-b433-4a70-aeda-3866b89c9197-kube-api-access-fpn46\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.432226 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc76d60-b433-4a70-aeda-3866b89c9197-logs\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.432358 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-config-data\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.442008 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="0123b3bc298be4c2ac62175c379dc6efb186183e599f1998133a95b106c98408" exitCode=0 Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.442078 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"0123b3bc298be4c2ac62175c379dc6efb186183e599f1998133a95b106c98408"} Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.442127 4831 scope.go:117] "RemoveContainer" containerID="e3d48d2b893c2eb24b90277e2cc2d8a14727460b3bf0732b9f1999efdd5e7c27" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.460173 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.476927 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.478424 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.487729 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.489021 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.491393 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.494700 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.494948 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.506581 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.517593 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8df4bd59-49bbv"] Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.526319 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.533532 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-config-data\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.534614 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.534762 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpn46\" (UniqueName: \"kubernetes.io/projected/0bc76d60-b433-4a70-aeda-3866b89c9197-kube-api-access-fpn46\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.534855 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc76d60-b433-4a70-aeda-3866b89c9197-logs\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.535286 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc76d60-b433-4a70-aeda-3866b89c9197-logs\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.536484 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8df4bd59-49bbv"] Dec 04 10:36:22 crc kubenswrapper[4831]: 
I1204 10:36:22.540257 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-config-data\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.542335 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.582348 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpn46\" (UniqueName: \"kubernetes.io/projected/0bc76d60-b433-4a70-aeda-3866b89c9197-kube-api-access-fpn46\") pod \"nova-api-0\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.588403 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.597082 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.601070 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.607617 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.636337 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.636410 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.636473 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnrsx\" (UniqueName: \"kubernetes.io/projected/708027bc-d695-4f49-bb70-31173245a13f-kube-api-access-vnrsx\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.636522 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-sb\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc 
kubenswrapper[4831]: I1204 10:36:22.636565 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-swift-storage-0\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.636617 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4cvc\" (UniqueName: \"kubernetes.io/projected/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-kube-api-access-v4cvc\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.636638 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-logs\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.636717 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-svc\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.636745 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-nb\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 
10:36:22.636772 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-config\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.636805 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-config-data\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.636830 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sttrf\" (UniqueName: \"kubernetes.io/projected/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-kube-api-access-sttrf\") pod \"nova-cell1-novncproxy-0\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.636862 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738525 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-svc\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738575 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-nb\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738604 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-config\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738639 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-config-data\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738692 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sttrf\" (UniqueName: \"kubernetes.io/projected/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-kube-api-access-sttrf\") pod \"nova-cell1-novncproxy-0\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738726 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738769 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738796 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738829 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738873 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnrsx\" (UniqueName: \"kubernetes.io/projected/708027bc-d695-4f49-bb70-31173245a13f-kube-api-access-vnrsx\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738899 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-config-data\") pod \"nova-scheduler-0\" (UID: \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738945 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-sb\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: 
\"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.738968 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-swift-storage-0\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.739021 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4cvc\" (UniqueName: \"kubernetes.io/projected/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-kube-api-access-v4cvc\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.744378 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-logs\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.744506 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6nlv\" (UniqueName: \"kubernetes.io/projected/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-kube-api-access-t6nlv\") pod \"nova-scheduler-0\" (UID: \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.745734 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-svc\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc 
kubenswrapper[4831]: I1204 10:36:22.746306 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-swift-storage-0\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.747411 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-logs\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.748471 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-sb\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.749210 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-nb\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.749475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-config\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.753844 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.754237 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.765090 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-config-data\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.771376 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4cvc\" (UniqueName: \"kubernetes.io/projected/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-kube-api-access-v4cvc\") pod \"nova-metadata-0\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.771939 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.776172 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnrsx\" (UniqueName: \"kubernetes.io/projected/708027bc-d695-4f49-bb70-31173245a13f-kube-api-access-vnrsx\") pod \"dnsmasq-dns-8df4bd59-49bbv\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " 
pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.780626 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sttrf\" (UniqueName: \"kubernetes.io/projected/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-kube-api-access-sttrf\") pod \"nova-cell1-novncproxy-0\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.824726 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.845853 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6nlv\" (UniqueName: \"kubernetes.io/projected/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-kube-api-access-t6nlv\") pod \"nova-scheduler-0\" (UID: \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.845953 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.846009 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-config-data\") pod \"nova-scheduler-0\" (UID: \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.851501 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-config-data\") pod \"nova-scheduler-0\" (UID: 
\"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.854409 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.864646 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6nlv\" (UniqueName: \"kubernetes.io/projected/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-kube-api-access-t6nlv\") pod \"nova-scheduler-0\" (UID: \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.894273 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.937511 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.950038 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:22 crc kubenswrapper[4831]: I1204 10:36:22.977922 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.158894 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mmb2j"] Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.618888 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"61f7279f438426e6b31be64e9ccb62c729f466d054e9b0e0a804c882066b625e"} Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.766840 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:36:23 crc kubenswrapper[4831]: W1204 10:36:23.783705 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod948ac29a_d7ab_47a3_aa8e_d74b5f1de702.slice/crio-23076ac4e9d1a4540e9d46caad15677918147b1de0e02a71c93c4da3f456a427 WatchSource:0}: Error finding container 23076ac4e9d1a4540e9d46caad15677918147b1de0e02a71c93c4da3f456a427: Status 404 returned error can't find the container with id 23076ac4e9d1a4540e9d46caad15677918147b1de0e02a71c93c4da3f456a427 Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.822609 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.847648 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.871125 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.880296 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8df4bd59-49bbv"] Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.927108 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-r67gq"] Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.954025 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r67gq"] Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.956806 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.966238 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 10:36:23 crc kubenswrapper[4831]: I1204 10:36:23.967206 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.078169 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-config-data\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.078473 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxrc9\" (UniqueName: \"kubernetes.io/projected/c6a6469d-082d-4be1-ac56-bf92b750390d-kube-api-access-cxrc9\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.078612 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-scripts\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " 
pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.078651 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.179808 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.181825 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-config-data\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.182015 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxrc9\" (UniqueName: \"kubernetes.io/projected/c6a6469d-082d-4be1-ac56-bf92b750390d-kube-api-access-cxrc9\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.182351 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-scripts\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: 
\"c6a6469d-082d-4be1-ac56-bf92b750390d\") " pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.187124 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.187463 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-scripts\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.198383 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-config-data\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.199197 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxrc9\" (UniqueName: \"kubernetes.io/projected/c6a6469d-082d-4be1-ac56-bf92b750390d-kube-api-access-cxrc9\") pod \"nova-cell1-conductor-db-sync-r67gq\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.301837 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.521415 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bc76d60-b433-4a70-aeda-3866b89c9197","Type":"ContainerStarted","Data":"5db4e6b270b4aeed66b40f6d1c67b3fd9bdb6d97b12becb904649e53837d965e"} Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.528430 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"948ac29a-d7ab-47a3-aa8e-d74b5f1de702","Type":"ContainerStarted","Data":"23076ac4e9d1a4540e9d46caad15677918147b1de0e02a71c93c4da3f456a427"} Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.531951 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mmb2j" event={"ID":"2dbc983b-0bfd-4646-9a07-4f0894f1c480","Type":"ContainerStarted","Data":"f626c01b67169f6dbf326e381e3a344d2ef41499a05917f879afca21d3c87fad"} Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.531994 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mmb2j" event={"ID":"2dbc983b-0bfd-4646-9a07-4f0894f1c480","Type":"ContainerStarted","Data":"e09764b432f95e1e7c7e24b2abbe156948837a30c2b4d51d945944edbf657886"} Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.539645 4831 generic.go:334] "Generic (PLEG): container finished" podID="708027bc-d695-4f49-bb70-31173245a13f" containerID="f4c2ff65ae25feb78d8e09bda55e78fca42b761b0f747b220faae5f59c50d54f" exitCode=0 Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.539762 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" event={"ID":"708027bc-d695-4f49-bb70-31173245a13f","Type":"ContainerDied","Data":"f4c2ff65ae25feb78d8e09bda55e78fca42b761b0f747b220faae5f59c50d54f"} Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.539788 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8df4bd59-49bbv" event={"ID":"708027bc-d695-4f49-bb70-31173245a13f","Type":"ContainerStarted","Data":"78f36bf099587aa292210e9615949f1c80cce61599853371f6d087e82a2c41d4"} Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.559456 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80ed7f7c-004e-4690-b4c0-edbd0cb31b71","Type":"ContainerStarted","Data":"92692d61ba9dcf68bdb7d88297a1633b081b1c9fff00a3b157eb230b89ac4cfc"} Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.565647 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2","Type":"ContainerStarted","Data":"106029a4a77f75f6ff9a424a17f590bfaccf6bb964f3a22272606d358f043014"} Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.571705 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mmb2j" podStartSLOduration=2.571686394 podStartE2EDuration="2.571686394s" podCreationTimestamp="2025-12-04 10:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:36:24.552254458 +0000 UTC m=+1281.501429772" watchObservedRunningTime="2025-12-04 10:36:24.571686394 +0000 UTC m=+1281.520861698" Dec 04 10:36:24 crc kubenswrapper[4831]: I1204 10:36:24.875924 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r67gq"] Dec 04 10:36:24 crc kubenswrapper[4831]: W1204 10:36:24.890201 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6a6469d_082d_4be1_ac56_bf92b750390d.slice/crio-69a14fc19f79831176e629c8523e102300d4ce9ee6a9be64bd7d885b62c27c22 WatchSource:0}: Error finding container 69a14fc19f79831176e629c8523e102300d4ce9ee6a9be64bd7d885b62c27c22: Status 404 returned error can't find the 
container with id 69a14fc19f79831176e629c8523e102300d4ce9ee6a9be64bd7d885b62c27c22 Dec 04 10:36:25 crc kubenswrapper[4831]: I1204 10:36:25.573842 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r67gq" event={"ID":"c6a6469d-082d-4be1-ac56-bf92b750390d","Type":"ContainerStarted","Data":"69a14fc19f79831176e629c8523e102300d4ce9ee6a9be64bd7d885b62c27c22"} Dec 04 10:36:25 crc kubenswrapper[4831]: I1204 10:36:25.587334 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" event={"ID":"708027bc-d695-4f49-bb70-31173245a13f","Type":"ContainerStarted","Data":"0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba"} Dec 04 10:36:25 crc kubenswrapper[4831]: I1204 10:36:25.587621 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:26 crc kubenswrapper[4831]: I1204 10:36:26.037045 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" podStartSLOduration=4.037029063 podStartE2EDuration="4.037029063s" podCreationTimestamp="2025-12-04 10:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:36:25.630297253 +0000 UTC m=+1282.579472567" watchObservedRunningTime="2025-12-04 10:36:26.037029063 +0000 UTC m=+1282.986204377" Dec 04 10:36:26 crc kubenswrapper[4831]: I1204 10:36:26.045383 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:36:26 crc kubenswrapper[4831]: I1204 10:36:26.128694 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:36:26 crc kubenswrapper[4831]: I1204 10:36:26.598723 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r67gq" 
event={"ID":"c6a6469d-082d-4be1-ac56-bf92b750390d","Type":"ContainerStarted","Data":"76f66e055faa71edef7fd22c90355f8ef99f2fbe084150654596ff28ab7ea2f9"} Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.611237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2","Type":"ContainerStarted","Data":"c7f81ae4c7b0f64b9d790182ddaf981790635588a8398d42efbcd77d76762dbe"} Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.611821 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2","Type":"ContainerStarted","Data":"5e493f5ba505cebd04effa30fd5e10a7212fae75c6c1f0cc6301d4c991727ce8"} Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.611380 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" containerName="nova-metadata-metadata" containerID="cri-o://c7f81ae4c7b0f64b9d790182ddaf981790635588a8398d42efbcd77d76762dbe" gracePeriod=30 Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.611329 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" containerName="nova-metadata-log" containerID="cri-o://5e493f5ba505cebd04effa30fd5e10a7212fae75c6c1f0cc6301d4c991727ce8" gracePeriod=30 Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.615348 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bc76d60-b433-4a70-aeda-3866b89c9197","Type":"ContainerStarted","Data":"5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa"} Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.615555 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"0bc76d60-b433-4a70-aeda-3866b89c9197","Type":"ContainerStarted","Data":"432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c"} Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.619148 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"948ac29a-d7ab-47a3-aa8e-d74b5f1de702","Type":"ContainerStarted","Data":"a38aa139053bbe0f386d00ab71e9750f2cb13948cd9ede8fb0f68c9f3cc5bbaf"} Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.619309 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="948ac29a-d7ab-47a3-aa8e-d74b5f1de702" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a38aa139053bbe0f386d00ab71e9750f2cb13948cd9ede8fb0f68c9f3cc5bbaf" gracePeriod=30 Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.625207 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80ed7f7c-004e-4690-b4c0-edbd0cb31b71","Type":"ContainerStarted","Data":"fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72"} Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.630375 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.515431579 podStartE2EDuration="5.630326848s" podCreationTimestamp="2025-12-04 10:36:22 +0000 UTC" firstStartedPulling="2025-12-04 10:36:23.789578347 +0000 UTC m=+1280.738753661" lastFinishedPulling="2025-12-04 10:36:26.904473616 +0000 UTC m=+1283.853648930" observedRunningTime="2025-12-04 10:36:27.628003127 +0000 UTC m=+1284.577178441" watchObservedRunningTime="2025-12-04 10:36:27.630326848 +0000 UTC m=+1284.579502172" Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.651852 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-r67gq" podStartSLOduration=4.651830099 
podStartE2EDuration="4.651830099s" podCreationTimestamp="2025-12-04 10:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:36:26.612109043 +0000 UTC m=+1283.561284357" watchObservedRunningTime="2025-12-04 10:36:27.651830099 +0000 UTC m=+1284.601005413" Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.664824 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.560522517 podStartE2EDuration="5.664805694s" podCreationTimestamp="2025-12-04 10:36:22 +0000 UTC" firstStartedPulling="2025-12-04 10:36:23.801763841 +0000 UTC m=+1280.750939155" lastFinishedPulling="2025-12-04 10:36:26.906047018 +0000 UTC m=+1283.855222332" observedRunningTime="2025-12-04 10:36:27.646026875 +0000 UTC m=+1284.595202189" watchObservedRunningTime="2025-12-04 10:36:27.664805694 +0000 UTC m=+1284.613981008" Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.680642 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.55797852 podStartE2EDuration="5.680622104s" podCreationTimestamp="2025-12-04 10:36:22 +0000 UTC" firstStartedPulling="2025-12-04 10:36:23.808397427 +0000 UTC m=+1280.757572741" lastFinishedPulling="2025-12-04 10:36:26.931041011 +0000 UTC m=+1283.880216325" observedRunningTime="2025-12-04 10:36:27.668943514 +0000 UTC m=+1284.618118828" watchObservedRunningTime="2025-12-04 10:36:27.680622104 +0000 UTC m=+1284.629797418" Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.691143 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.603403926 podStartE2EDuration="5.691125893s" podCreationTimestamp="2025-12-04 10:36:22 +0000 UTC" firstStartedPulling="2025-12-04 10:36:23.816760859 +0000 UTC m=+1280.765936173" lastFinishedPulling="2025-12-04 
10:36:26.904482826 +0000 UTC m=+1283.853658140" observedRunningTime="2025-12-04 10:36:27.687140827 +0000 UTC m=+1284.636316141" watchObservedRunningTime="2025-12-04 10:36:27.691125893 +0000 UTC m=+1284.640301207" Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.895020 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.937625 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.937722 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:36:27 crc kubenswrapper[4831]: I1204 10:36:27.978574 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 10:36:28 crc kubenswrapper[4831]: I1204 10:36:28.633861 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" containerID="5e493f5ba505cebd04effa30fd5e10a7212fae75c6c1f0cc6301d4c991727ce8" exitCode=143 Dec 04 10:36:28 crc kubenswrapper[4831]: I1204 10:36:28.634808 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2","Type":"ContainerDied","Data":"5e493f5ba505cebd04effa30fd5e10a7212fae75c6c1f0cc6301d4c991727ce8"} Dec 04 10:36:32 crc kubenswrapper[4831]: I1204 10:36:32.683281 4831 generic.go:334] "Generic (PLEG): container finished" podID="2dbc983b-0bfd-4646-9a07-4f0894f1c480" containerID="f626c01b67169f6dbf326e381e3a344d2ef41499a05917f879afca21d3c87fad" exitCode=0 Dec 04 10:36:32 crc kubenswrapper[4831]: I1204 10:36:32.683342 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mmb2j" 
event={"ID":"2dbc983b-0bfd-4646-9a07-4f0894f1c480","Type":"ContainerDied","Data":"f626c01b67169f6dbf326e381e3a344d2ef41499a05917f879afca21d3c87fad"} Dec 04 10:36:32 crc kubenswrapper[4831]: I1204 10:36:32.826511 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:36:32 crc kubenswrapper[4831]: I1204 10:36:32.826574 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:36:32 crc kubenswrapper[4831]: I1204 10:36:32.953521 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:36:32 crc kubenswrapper[4831]: I1204 10:36:32.978612 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.043092 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-665f7b5ff9-klz7d"] Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.043384 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" podUID="5173ee92-12de-4849-9659-882e5cfb1566" containerName="dnsmasq-dns" containerID="cri-o://d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce" gracePeriod=10 Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.052417 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.575967 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.694579 4831 generic.go:334] "Generic (PLEG): container finished" podID="5173ee92-12de-4849-9659-882e5cfb1566" containerID="d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce" exitCode=0 Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.694815 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.695482 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" event={"ID":"5173ee92-12de-4849-9659-882e5cfb1566","Type":"ContainerDied","Data":"d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce"} Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.695514 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665f7b5ff9-klz7d" event={"ID":"5173ee92-12de-4849-9659-882e5cfb1566","Type":"ContainerDied","Data":"848c24145c819c42882eb04be06ec52cd5d1725211d62079fa0bd470c60608e5"} Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.695531 4831 scope.go:117] "RemoveContainer" containerID="d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.722196 4831 scope.go:117] "RemoveContainer" containerID="1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.732938 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.735719 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-svc\") pod \"5173ee92-12de-4849-9659-882e5cfb1566\" (UID: 
\"5173ee92-12de-4849-9659-882e5cfb1566\") " Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.735797 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-config\") pod \"5173ee92-12de-4849-9659-882e5cfb1566\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.735837 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-sb\") pod \"5173ee92-12de-4849-9659-882e5cfb1566\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.735889 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg7sl\" (UniqueName: \"kubernetes.io/projected/5173ee92-12de-4849-9659-882e5cfb1566-kube-api-access-rg7sl\") pod \"5173ee92-12de-4849-9659-882e5cfb1566\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.735937 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-nb\") pod \"5173ee92-12de-4849-9659-882e5cfb1566\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.735998 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-swift-storage-0\") pod \"5173ee92-12de-4849-9659-882e5cfb1566\" (UID: \"5173ee92-12de-4849-9659-882e5cfb1566\") " Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.749628 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5173ee92-12de-4849-9659-882e5cfb1566-kube-api-access-rg7sl" (OuterVolumeSpecName: "kube-api-access-rg7sl") pod "5173ee92-12de-4849-9659-882e5cfb1566" (UID: "5173ee92-12de-4849-9659-882e5cfb1566"). InnerVolumeSpecName "kube-api-access-rg7sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.778769 4831 scope.go:117] "RemoveContainer" containerID="d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce" Dec 04 10:36:33 crc kubenswrapper[4831]: E1204 10:36:33.779285 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce\": container with ID starting with d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce not found: ID does not exist" containerID="d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.779317 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce"} err="failed to get container status \"d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce\": rpc error: code = NotFound desc = could not find container \"d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce\": container with ID starting with d073ffebbd17041287755022174d9c5fe685e388d6b4c25553ca339ab6ba33ce not found: ID does not exist" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.779338 4831 scope.go:117] "RemoveContainer" containerID="1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4" Dec 04 10:36:33 crc kubenswrapper[4831]: E1204 10:36:33.779562 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4\": 
container with ID starting with 1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4 not found: ID does not exist" containerID="1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.779584 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4"} err="failed to get container status \"1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4\": rpc error: code = NotFound desc = could not find container \"1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4\": container with ID starting with 1927f36549715d169df02f8ee59ec04b92459a9191554b0cb7d3d2bb01ae70e4 not found: ID does not exist" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.820328 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5173ee92-12de-4849-9659-882e5cfb1566" (UID: "5173ee92-12de-4849-9659-882e5cfb1566"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.820678 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5173ee92-12de-4849-9659-882e5cfb1566" (UID: "5173ee92-12de-4849-9659-882e5cfb1566"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.838157 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.838205 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.838217 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg7sl\" (UniqueName: \"kubernetes.io/projected/5173ee92-12de-4849-9659-882e5cfb1566-kube-api-access-rg7sl\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.854940 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-config" (OuterVolumeSpecName: "config") pod "5173ee92-12de-4849-9659-882e5cfb1566" (UID: "5173ee92-12de-4849-9659-882e5cfb1566"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.866791 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5173ee92-12de-4849-9659-882e5cfb1566" (UID: "5173ee92-12de-4849-9659-882e5cfb1566"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.867170 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5173ee92-12de-4849-9659-882e5cfb1566" (UID: "5173ee92-12de-4849-9659-882e5cfb1566"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.916802 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.917168 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.938948 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.938991 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:33 crc kubenswrapper[4831]: I1204 10:36:33.939004 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5173ee92-12de-4849-9659-882e5cfb1566-config\") 
on node \"crc\" DevicePath \"\"" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.012053 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.035166 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-665f7b5ff9-klz7d"] Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.040321 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-combined-ca-bundle\") pod \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.040360 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-scripts\") pod \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.040465 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-config-data\") pod \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.040492 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kdtc\" (UniqueName: \"kubernetes.io/projected/2dbc983b-0bfd-4646-9a07-4f0894f1c480-kube-api-access-2kdtc\") pod \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\" (UID: \"2dbc983b-0bfd-4646-9a07-4f0894f1c480\") " Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.053056 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-scripts" (OuterVolumeSpecName: "scripts") pod "2dbc983b-0bfd-4646-9a07-4f0894f1c480" (UID: "2dbc983b-0bfd-4646-9a07-4f0894f1c480"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.053116 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-665f7b5ff9-klz7d"] Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.064402 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dbc983b-0bfd-4646-9a07-4f0894f1c480-kube-api-access-2kdtc" (OuterVolumeSpecName: "kube-api-access-2kdtc") pod "2dbc983b-0bfd-4646-9a07-4f0894f1c480" (UID: "2dbc983b-0bfd-4646-9a07-4f0894f1c480"). InnerVolumeSpecName "kube-api-access-2kdtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.080500 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-config-data" (OuterVolumeSpecName: "config-data") pod "2dbc983b-0bfd-4646-9a07-4f0894f1c480" (UID: "2dbc983b-0bfd-4646-9a07-4f0894f1c480"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.103377 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dbc983b-0bfd-4646-9a07-4f0894f1c480" (UID: "2dbc983b-0bfd-4646-9a07-4f0894f1c480"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.142843 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.142889 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kdtc\" (UniqueName: \"kubernetes.io/projected/2dbc983b-0bfd-4646-9a07-4f0894f1c480-kube-api-access-2kdtc\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.142902 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.142911 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbc983b-0bfd-4646-9a07-4f0894f1c480-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.711091 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mmb2j" event={"ID":"2dbc983b-0bfd-4646-9a07-4f0894f1c480","Type":"ContainerDied","Data":"e09764b432f95e1e7c7e24b2abbe156948837a30c2b4d51d945944edbf657886"} Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.711171 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e09764b432f95e1e7c7e24b2abbe156948837a30c2b4d51d945944edbf657886" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.711326 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mmb2j" Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.908708 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.921021 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.921690 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerName="nova-api-api" containerID="cri-o://5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa" gracePeriod=30 Dec 04 10:36:34 crc kubenswrapper[4831]: I1204 10:36:34.921650 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerName="nova-api-log" containerID="cri-o://432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c" gracePeriod=30 Dec 04 10:36:35 crc kubenswrapper[4831]: I1204 10:36:35.286206 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5173ee92-12de-4849-9659-882e5cfb1566" path="/var/lib/kubelet/pods/5173ee92-12de-4849-9659-882e5cfb1566/volumes" Dec 04 10:36:35 crc kubenswrapper[4831]: I1204 10:36:35.721978 4831 generic.go:334] "Generic (PLEG): container finished" podID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerID="432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c" exitCode=143 Dec 04 10:36:35 crc kubenswrapper[4831]: I1204 10:36:35.722051 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bc76d60-b433-4a70-aeda-3866b89c9197","Type":"ContainerDied","Data":"432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c"} Dec 04 10:36:35 crc kubenswrapper[4831]: I1204 10:36:35.724674 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="c6a6469d-082d-4be1-ac56-bf92b750390d" containerID="76f66e055faa71edef7fd22c90355f8ef99f2fbe084150654596ff28ab7ea2f9" exitCode=0 Dec 04 10:36:35 crc kubenswrapper[4831]: I1204 10:36:35.724839 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="80ed7f7c-004e-4690-b4c0-edbd0cb31b71" containerName="nova-scheduler-scheduler" containerID="cri-o://fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72" gracePeriod=30 Dec 04 10:36:35 crc kubenswrapper[4831]: I1204 10:36:35.725125 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r67gq" event={"ID":"c6a6469d-082d-4be1-ac56-bf92b750390d","Type":"ContainerDied","Data":"76f66e055faa71edef7fd22c90355f8ef99f2fbe084150654596ff28ab7ea2f9"} Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.196553 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.213053 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-config-data\") pod \"c6a6469d-082d-4be1-ac56-bf92b750390d\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.213153 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-scripts\") pod \"c6a6469d-082d-4be1-ac56-bf92b750390d\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.213263 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-combined-ca-bundle\") pod \"c6a6469d-082d-4be1-ac56-bf92b750390d\" 
(UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.213344 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxrc9\" (UniqueName: \"kubernetes.io/projected/c6a6469d-082d-4be1-ac56-bf92b750390d-kube-api-access-cxrc9\") pod \"c6a6469d-082d-4be1-ac56-bf92b750390d\" (UID: \"c6a6469d-082d-4be1-ac56-bf92b750390d\") " Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.220845 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-scripts" (OuterVolumeSpecName: "scripts") pod "c6a6469d-082d-4be1-ac56-bf92b750390d" (UID: "c6a6469d-082d-4be1-ac56-bf92b750390d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.222040 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a6469d-082d-4be1-ac56-bf92b750390d-kube-api-access-cxrc9" (OuterVolumeSpecName: "kube-api-access-cxrc9") pod "c6a6469d-082d-4be1-ac56-bf92b750390d" (UID: "c6a6469d-082d-4be1-ac56-bf92b750390d"). InnerVolumeSpecName "kube-api-access-cxrc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.249927 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6a6469d-082d-4be1-ac56-bf92b750390d" (UID: "c6a6469d-082d-4be1-ac56-bf92b750390d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.250922 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-config-data" (OuterVolumeSpecName: "config-data") pod "c6a6469d-082d-4be1-ac56-bf92b750390d" (UID: "c6a6469d-082d-4be1-ac56-bf92b750390d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.315799 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxrc9\" (UniqueName: \"kubernetes.io/projected/c6a6469d-082d-4be1-ac56-bf92b750390d-kube-api-access-cxrc9\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.315931 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.316012 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.316087 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a6469d-082d-4be1-ac56-bf92b750390d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.452781 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.518067 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpn46\" (UniqueName: \"kubernetes.io/projected/0bc76d60-b433-4a70-aeda-3866b89c9197-kube-api-access-fpn46\") pod \"0bc76d60-b433-4a70-aeda-3866b89c9197\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.518138 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-combined-ca-bundle\") pod \"0bc76d60-b433-4a70-aeda-3866b89c9197\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.518212 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc76d60-b433-4a70-aeda-3866b89c9197-logs\") pod \"0bc76d60-b433-4a70-aeda-3866b89c9197\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.518345 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-config-data\") pod \"0bc76d60-b433-4a70-aeda-3866b89c9197\" (UID: \"0bc76d60-b433-4a70-aeda-3866b89c9197\") " Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.519200 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bc76d60-b433-4a70-aeda-3866b89c9197-logs" (OuterVolumeSpecName: "logs") pod "0bc76d60-b433-4a70-aeda-3866b89c9197" (UID: "0bc76d60-b433-4a70-aeda-3866b89c9197"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.523462 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bc76d60-b433-4a70-aeda-3866b89c9197-kube-api-access-fpn46" (OuterVolumeSpecName: "kube-api-access-fpn46") pod "0bc76d60-b433-4a70-aeda-3866b89c9197" (UID: "0bc76d60-b433-4a70-aeda-3866b89c9197"). InnerVolumeSpecName "kube-api-access-fpn46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.547371 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-config-data" (OuterVolumeSpecName: "config-data") pod "0bc76d60-b433-4a70-aeda-3866b89c9197" (UID: "0bc76d60-b433-4a70-aeda-3866b89c9197"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.568498 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bc76d60-b433-4a70-aeda-3866b89c9197" (UID: "0bc76d60-b433-4a70-aeda-3866b89c9197"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.620170 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc76d60-b433-4a70-aeda-3866b89c9197-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.620399 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.620460 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpn46\" (UniqueName: \"kubernetes.io/projected/0bc76d60-b433-4a70-aeda-3866b89c9197-kube-api-access-fpn46\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.620517 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc76d60-b433-4a70-aeda-3866b89c9197-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.745324 4831 generic.go:334] "Generic (PLEG): container finished" podID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerID="5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa" exitCode=0 Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.745387 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bc76d60-b433-4a70-aeda-3866b89c9197","Type":"ContainerDied","Data":"5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa"} Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.745410 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bc76d60-b433-4a70-aeda-3866b89c9197","Type":"ContainerDied","Data":"5db4e6b270b4aeed66b40f6d1c67b3fd9bdb6d97b12becb904649e53837d965e"} Dec 04 10:36:37 crc kubenswrapper[4831]: 
I1204 10:36:37.745426 4831 scope.go:117] "RemoveContainer" containerID="5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.745433 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.748805 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r67gq" event={"ID":"c6a6469d-082d-4be1-ac56-bf92b750390d","Type":"ContainerDied","Data":"69a14fc19f79831176e629c8523e102300d4ce9ee6a9be64bd7d885b62c27c22"} Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.748841 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69a14fc19f79831176e629c8523e102300d4ce9ee6a9be64bd7d885b62c27c22" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.748854 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r67gq" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.789043 4831 scope.go:117] "RemoveContainer" containerID="432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.800902 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.812365 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.834195 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.835062 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dbc983b-0bfd-4646-9a07-4f0894f1c480" containerName="nova-manage" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.835098 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2dbc983b-0bfd-4646-9a07-4f0894f1c480" containerName="nova-manage" Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.835134 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerName="nova-api-log" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.835147 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerName="nova-api-log" Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.835191 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerName="nova-api-api" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.835204 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerName="nova-api-api" Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.835242 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5173ee92-12de-4849-9659-882e5cfb1566" containerName="init" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.835253 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5173ee92-12de-4849-9659-882e5cfb1566" containerName="init" Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.835272 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a6469d-082d-4be1-ac56-bf92b750390d" containerName="nova-cell1-conductor-db-sync" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.835283 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a6469d-082d-4be1-ac56-bf92b750390d" containerName="nova-cell1-conductor-db-sync" Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.835317 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5173ee92-12de-4849-9659-882e5cfb1566" containerName="dnsmasq-dns" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.835344 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5173ee92-12de-4849-9659-882e5cfb1566" containerName="dnsmasq-dns" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.835756 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerName="nova-api-api" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.835792 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a6469d-082d-4be1-ac56-bf92b750390d" containerName="nova-cell1-conductor-db-sync" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.835811 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bc76d60-b433-4a70-aeda-3866b89c9197" containerName="nova-api-log" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.835852 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5173ee92-12de-4849-9659-882e5cfb1566" containerName="dnsmasq-dns" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.835883 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dbc983b-0bfd-4646-9a07-4f0894f1c480" containerName="nova-manage" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.840073 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.844499 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.845356 4831 scope.go:117] "RemoveContainer" containerID="5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa" Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.849190 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa\": container with ID starting with 5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa not found: ID does not exist" containerID="5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.849230 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa"} err="failed to get container status \"5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa\": rpc error: code = NotFound desc = could not find container \"5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa\": container with ID starting with 5d85f870ed5af273cfbb5057932c378fbe015b196c0d149be0b59a952453f7fa not found: ID does not exist" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.849255 4831 scope.go:117] "RemoveContainer" containerID="432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.850430 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.850771 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c\": container with ID starting with 432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c not found: ID does not exist" containerID="432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.850805 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c"} err="failed to get container status \"432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c\": rpc error: code = NotFound desc = could not find container \"432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c\": container with ID starting with 432c52cd74b8f2649b471e9a8cc62429f37c3b65a3e26b922ca05f67c3d2783c not found: ID does not exist" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.852099 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.854400 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.889123 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:36:37 crc kubenswrapper[4831]: I1204 10:36:37.901921 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.980292 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.981864 4831 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.985092 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:36:37 crc kubenswrapper[4831]: E1204 10:36:37.985132 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="80ed7f7c-004e-4690-b4c0-edbd0cb31b71" containerName="nova-scheduler-scheduler" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.026589 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.026746 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f4c049-2837-400e-8ee0-accb79c79fc5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b8f4c049-2837-400e-8ee0-accb79c79fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.026796 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-config-data\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.026883 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f4c049-2837-400e-8ee0-accb79c79fc5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b8f4c049-2837-400e-8ee0-accb79c79fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.026958 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ttvg\" (UniqueName: \"kubernetes.io/projected/6973b9dd-5fe0-4b1d-9df7-6de89e670246-kube-api-access-4ttvg\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.026982 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6973b9dd-5fe0-4b1d-9df7-6de89e670246-logs\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.027019 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd6kd\" (UniqueName: \"kubernetes.io/projected/b8f4c049-2837-400e-8ee0-accb79c79fc5-kube-api-access-qd6kd\") pod \"nova-cell1-conductor-0\" (UID: \"b8f4c049-2837-400e-8ee0-accb79c79fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.129326 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b8f4c049-2837-400e-8ee0-accb79c79fc5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b8f4c049-2837-400e-8ee0-accb79c79fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.129395 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ttvg\" (UniqueName: \"kubernetes.io/projected/6973b9dd-5fe0-4b1d-9df7-6de89e670246-kube-api-access-4ttvg\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.129423 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6973b9dd-5fe0-4b1d-9df7-6de89e670246-logs\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.129470 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd6kd\" (UniqueName: \"kubernetes.io/projected/b8f4c049-2837-400e-8ee0-accb79c79fc5-kube-api-access-qd6kd\") pod \"nova-cell1-conductor-0\" (UID: \"b8f4c049-2837-400e-8ee0-accb79c79fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.129517 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.129623 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f4c049-2837-400e-8ee0-accb79c79fc5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b8f4c049-2837-400e-8ee0-accb79c79fc5\") " 
pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.129705 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-config-data\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.130286 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6973b9dd-5fe0-4b1d-9df7-6de89e670246-logs\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.135219 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.135517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-config-data\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.136367 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f4c049-2837-400e-8ee0-accb79c79fc5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b8f4c049-2837-400e-8ee0-accb79c79fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.139637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b8f4c049-2837-400e-8ee0-accb79c79fc5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b8f4c049-2837-400e-8ee0-accb79c79fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.151016 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd6kd\" (UniqueName: \"kubernetes.io/projected/b8f4c049-2837-400e-8ee0-accb79c79fc5-kube-api-access-qd6kd\") pod \"nova-cell1-conductor-0\" (UID: \"b8f4c049-2837-400e-8ee0-accb79c79fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.151060 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ttvg\" (UniqueName: \"kubernetes.io/projected/6973b9dd-5fe0-4b1d-9df7-6de89e670246-kube-api-access-4ttvg\") pod \"nova-api-0\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.158687 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.174021 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.627849 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 10:36:38 crc kubenswrapper[4831]: W1204 10:36:38.629284 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f4c049_2837_400e_8ee0_accb79c79fc5.slice/crio-6dc231b294d15a0df575842969ec51bd68fa2215417ec3fd5b0c14ad501ef35c WatchSource:0}: Error finding container 6dc231b294d15a0df575842969ec51bd68fa2215417ec3fd5b0c14ad501ef35c: Status 404 returned error can't find the container with id 6dc231b294d15a0df575842969ec51bd68fa2215417ec3fd5b0c14ad501ef35c Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.753556 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.771129 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6973b9dd-5fe0-4b1d-9df7-6de89e670246","Type":"ContainerStarted","Data":"4fd059f0723c19d09c8021830b9d7462d5ebfbaec462fc8e4fc45dd9e77884a4"} Dec 04 10:36:38 crc kubenswrapper[4831]: I1204 10:36:38.773629 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b8f4c049-2837-400e-8ee0-accb79c79fc5","Type":"ContainerStarted","Data":"6dc231b294d15a0df575842969ec51bd68fa2215417ec3fd5b0c14ad501ef35c"} Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.302118 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bc76d60-b433-4a70-aeda-3866b89c9197" path="/var/lib/kubelet/pods/0bc76d60-b433-4a70-aeda-3866b89c9197/volumes" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.486730 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.664373 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-combined-ca-bundle\") pod \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\" (UID: \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.664477 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-config-data\") pod \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\" (UID: \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.664643 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6nlv\" (UniqueName: \"kubernetes.io/projected/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-kube-api-access-t6nlv\") pod \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\" (UID: \"80ed7f7c-004e-4690-b4c0-edbd0cb31b71\") " Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.670129 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-kube-api-access-t6nlv" (OuterVolumeSpecName: "kube-api-access-t6nlv") pod "80ed7f7c-004e-4690-b4c0-edbd0cb31b71" (UID: "80ed7f7c-004e-4690-b4c0-edbd0cb31b71"). InnerVolumeSpecName "kube-api-access-t6nlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.702447 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-config-data" (OuterVolumeSpecName: "config-data") pod "80ed7f7c-004e-4690-b4c0-edbd0cb31b71" (UID: "80ed7f7c-004e-4690-b4c0-edbd0cb31b71"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.723175 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80ed7f7c-004e-4690-b4c0-edbd0cb31b71" (UID: "80ed7f7c-004e-4690-b4c0-edbd0cb31b71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.769245 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6nlv\" (UniqueName: \"kubernetes.io/projected/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-kube-api-access-t6nlv\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.769293 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.769302 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ed7f7c-004e-4690-b4c0-edbd0cb31b71-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.793282 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6973b9dd-5fe0-4b1d-9df7-6de89e670246","Type":"ContainerStarted","Data":"184de0549ae1f93177ed87ee9769b3c77e64bc7f2f13269f4c4893eac88c81c7"} Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.793342 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6973b9dd-5fe0-4b1d-9df7-6de89e670246","Type":"ContainerStarted","Data":"b5ae963d18514b6df38f3d1569269e84b02e2651bd743da7404dc5a914b52ada"} Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.797035 4831 generic.go:334] 
"Generic (PLEG): container finished" podID="80ed7f7c-004e-4690-b4c0-edbd0cb31b71" containerID="fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72" exitCode=0 Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.797175 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80ed7f7c-004e-4690-b4c0-edbd0cb31b71","Type":"ContainerDied","Data":"fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72"} Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.797217 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80ed7f7c-004e-4690-b4c0-edbd0cb31b71","Type":"ContainerDied","Data":"92692d61ba9dcf68bdb7d88297a1633b081b1c9fff00a3b157eb230b89ac4cfc"} Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.797152 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.797239 4831 scope.go:117] "RemoveContainer" containerID="fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.800428 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b8f4c049-2837-400e-8ee0-accb79c79fc5","Type":"ContainerStarted","Data":"133eaf662b9f7e4a77a657c32021115ecdd109e19dc1993afcd87b63cc62014a"} Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.802926 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.821250 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8212276259999998 podStartE2EDuration="2.821227626s" podCreationTimestamp="2025-12-04 10:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-04 10:36:39.814930039 +0000 UTC m=+1296.764105363" watchObservedRunningTime="2025-12-04 10:36:39.821227626 +0000 UTC m=+1296.770402940" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.829899 4831 scope.go:117] "RemoveContainer" containerID="fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72" Dec 04 10:36:39 crc kubenswrapper[4831]: E1204 10:36:39.830360 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72\": container with ID starting with fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72 not found: ID does not exist" containerID="fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.830403 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72"} err="failed to get container status \"fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72\": rpc error: code = NotFound desc = could not find container \"fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72\": container with ID starting with fd62ea1b531305afca9437e5369c1a85fbef48d0132d4b5f38ed8e3d0ac45e72 not found: ID does not exist" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.852463 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.852440465 podStartE2EDuration="2.852440465s" podCreationTimestamp="2025-12-04 10:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:36:39.836007489 +0000 UTC m=+1296.785182803" watchObservedRunningTime="2025-12-04 10:36:39.852440465 +0000 UTC m=+1296.801615779" Dec 04 10:36:39 crc 
kubenswrapper[4831]: I1204 10:36:39.880973 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.891425 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.899782 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:36:39 crc kubenswrapper[4831]: E1204 10:36:39.900259 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ed7f7c-004e-4690-b4c0-edbd0cb31b71" containerName="nova-scheduler-scheduler" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.900277 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ed7f7c-004e-4690-b4c0-edbd0cb31b71" containerName="nova-scheduler-scheduler" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.900487 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="80ed7f7c-004e-4690-b4c0-edbd0cb31b71" containerName="nova-scheduler-scheduler" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.901207 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.904438 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.909554 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.973591 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.973646 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-config-data\") pod \"nova-scheduler-0\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:39 crc kubenswrapper[4831]: I1204 10:36:39.973747 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l742x\" (UniqueName: \"kubernetes.io/projected/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-kube-api-access-l742x\") pod \"nova-scheduler-0\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:40 crc kubenswrapper[4831]: I1204 10:36:40.074719 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:40 crc kubenswrapper[4831]: I1204 10:36:40.074788 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-config-data\") pod \"nova-scheduler-0\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:40 crc kubenswrapper[4831]: I1204 10:36:40.074831 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l742x\" (UniqueName: \"kubernetes.io/projected/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-kube-api-access-l742x\") pod \"nova-scheduler-0\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:40 crc kubenswrapper[4831]: I1204 10:36:40.079099 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-config-data\") pod \"nova-scheduler-0\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:40 crc kubenswrapper[4831]: I1204 10:36:40.088492 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:40 crc kubenswrapper[4831]: I1204 10:36:40.092162 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l742x\" (UniqueName: \"kubernetes.io/projected/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-kube-api-access-l742x\") pod \"nova-scheduler-0\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " pod="openstack/nova-scheduler-0" Dec 04 10:36:40 crc kubenswrapper[4831]: I1204 10:36:40.218417 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:36:40 crc kubenswrapper[4831]: I1204 10:36:40.717852 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:36:40 crc kubenswrapper[4831]: W1204 10:36:40.720810 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbd6d571_6f32_4d3d_8149_02b1c74c9ec2.slice/crio-3ae98c822c8829ee8972feff04bb4b66cb3f7a8baa1f146ae8df4d0cabf65bd3 WatchSource:0}: Error finding container 3ae98c822c8829ee8972feff04bb4b66cb3f7a8baa1f146ae8df4d0cabf65bd3: Status 404 returned error can't find the container with id 3ae98c822c8829ee8972feff04bb4b66cb3f7a8baa1f146ae8df4d0cabf65bd3 Dec 04 10:36:40 crc kubenswrapper[4831]: I1204 10:36:40.812121 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2","Type":"ContainerStarted","Data":"3ae98c822c8829ee8972feff04bb4b66cb3f7a8baa1f146ae8df4d0cabf65bd3"} Dec 04 10:36:41 crc kubenswrapper[4831]: I1204 10:36:41.314734 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80ed7f7c-004e-4690-b4c0-edbd0cb31b71" path="/var/lib/kubelet/pods/80ed7f7c-004e-4690-b4c0-edbd0cb31b71/volumes" Dec 04 10:36:41 crc kubenswrapper[4831]: I1204 10:36:41.825222 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2","Type":"ContainerStarted","Data":"0362367f1999aa21a31283dbc6c3cdd4a16ff0d4c505294f6c9a56cd58fd5792"} Dec 04 10:36:41 crc kubenswrapper[4831]: I1204 10:36:41.848677 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8486438979999997 podStartE2EDuration="2.848643898s" podCreationTimestamp="2025-12-04 10:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-04 10:36:41.842323431 +0000 UTC m=+1298.791498785" watchObservedRunningTime="2025-12-04 10:36:41.848643898 +0000 UTC m=+1298.797819212" Dec 04 10:36:43 crc kubenswrapper[4831]: I1204 10:36:43.222491 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 04 10:36:43 crc kubenswrapper[4831]: I1204 10:36:43.730633 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 10:36:45 crc kubenswrapper[4831]: I1204 10:36:45.218996 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 10:36:47 crc kubenswrapper[4831]: I1204 10:36:47.317500 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:36:47 crc kubenswrapper[4831]: I1204 10:36:47.318109 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="52a381c8-f093-4972-90ac-64799e0184c2" containerName="kube-state-metrics" containerID="cri-o://98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985" gracePeriod=30 Dec 04 10:36:47 crc kubenswrapper[4831]: I1204 10:36:47.854288 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:36:47 crc kubenswrapper[4831]: I1204 10:36:47.885986 4831 generic.go:334] "Generic (PLEG): container finished" podID="52a381c8-f093-4972-90ac-64799e0184c2" containerID="98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985" exitCode=2 Dec 04 10:36:47 crc kubenswrapper[4831]: I1204 10:36:47.886032 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"52a381c8-f093-4972-90ac-64799e0184c2","Type":"ContainerDied","Data":"98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985"} Dec 04 10:36:47 crc kubenswrapper[4831]: I1204 10:36:47.886061 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"52a381c8-f093-4972-90ac-64799e0184c2","Type":"ContainerDied","Data":"b6430f35e1ea09b90b169b9825f390619c6c2acc8669b3c7a91400a7d2e03fe9"} Dec 04 10:36:47 crc kubenswrapper[4831]: I1204 10:36:47.886067 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:36:47 crc kubenswrapper[4831]: I1204 10:36:47.886077 4831 scope.go:117] "RemoveContainer" containerID="98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985" Dec 04 10:36:47 crc kubenswrapper[4831]: I1204 10:36:47.931853 4831 scope.go:117] "RemoveContainer" containerID="98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985" Dec 04 10:36:47 crc kubenswrapper[4831]: E1204 10:36:47.932334 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985\": container with ID starting with 98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985 not found: ID does not exist" containerID="98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985" Dec 04 10:36:47 crc kubenswrapper[4831]: I1204 10:36:47.932395 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985"} err="failed to get container status \"98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985\": rpc error: code = NotFound desc = could not find container \"98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985\": container with ID starting with 98c180c693a926e92092f69d0d0e65bd179b51f8837e7a9a74f9b28c70fd7985 not found: ID does not exist" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.025831 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ckzz\" (UniqueName: \"kubernetes.io/projected/52a381c8-f093-4972-90ac-64799e0184c2-kube-api-access-5ckzz\") pod \"52a381c8-f093-4972-90ac-64799e0184c2\" (UID: \"52a381c8-f093-4972-90ac-64799e0184c2\") " Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.035985 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/52a381c8-f093-4972-90ac-64799e0184c2-kube-api-access-5ckzz" (OuterVolumeSpecName: "kube-api-access-5ckzz") pod "52a381c8-f093-4972-90ac-64799e0184c2" (UID: "52a381c8-f093-4972-90ac-64799e0184c2"). InnerVolumeSpecName "kube-api-access-5ckzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.128934 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ckzz\" (UniqueName: \"kubernetes.io/projected/52a381c8-f093-4972-90ac-64799e0184c2-kube-api-access-5ckzz\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.159797 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.159837 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.228131 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.237676 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.260762 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:36:48 crc kubenswrapper[4831]: E1204 10:36:48.261212 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a381c8-f093-4972-90ac-64799e0184c2" containerName="kube-state-metrics" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.261230 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a381c8-f093-4972-90ac-64799e0184c2" containerName="kube-state-metrics" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.261461 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="52a381c8-f093-4972-90ac-64799e0184c2" 
containerName="kube-state-metrics" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.262154 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.267292 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.267405 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.294590 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.334044 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd604d68-70ec-4bc4-bd2a-8bc427ced498-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.334137 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrlwp\" (UniqueName: \"kubernetes.io/projected/bd604d68-70ec-4bc4-bd2a-8bc427ced498-kube-api-access-hrlwp\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.334170 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd604d68-70ec-4bc4-bd2a-8bc427ced498-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.334235 
4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/bd604d68-70ec-4bc4-bd2a-8bc427ced498-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.436119 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd604d68-70ec-4bc4-bd2a-8bc427ced498-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.436294 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrlwp\" (UniqueName: \"kubernetes.io/projected/bd604d68-70ec-4bc4-bd2a-8bc427ced498-kube-api-access-hrlwp\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.436704 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd604d68-70ec-4bc4-bd2a-8bc427ced498-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.436854 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/bd604d68-70ec-4bc4-bd2a-8bc427ced498-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.441540 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd604d68-70ec-4bc4-bd2a-8bc427ced498-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.445064 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/bd604d68-70ec-4bc4-bd2a-8bc427ced498-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.445384 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd604d68-70ec-4bc4-bd2a-8bc427ced498-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.467000 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrlwp\" (UniqueName: \"kubernetes.io/projected/bd604d68-70ec-4bc4-bd2a-8bc427ced498-kube-api-access-hrlwp\") pod \"kube-state-metrics-0\" (UID: \"bd604d68-70ec-4bc4-bd2a-8bc427ced498\") " pod="openstack/kube-state-metrics-0" Dec 04 10:36:48 crc kubenswrapper[4831]: I1204 10:36:48.591716 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.103341 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.200851 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.241884 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.291453 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52a381c8-f093-4972-90ac-64799e0184c2" path="/var/lib/kubelet/pods/52a381c8-f093-4972-90ac-64799e0184c2/volumes" Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.474134 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.474475 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="ceilometer-central-agent" containerID="cri-o://1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd" gracePeriod=30 Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.475034 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="proxy-httpd" 
containerID="cri-o://2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0" gracePeriod=30 Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.475090 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="sg-core" containerID="cri-o://ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f" gracePeriod=30 Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.475138 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="ceilometer-notification-agent" containerID="cri-o://e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667" gracePeriod=30 Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.924093 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bd604d68-70ec-4bc4-bd2a-8bc427ced498","Type":"ContainerStarted","Data":"a43134b3e13a4070d22e5387b0276a16820f31bf2c86abbd855a7b28a52f6450"} Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.924430 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bd604d68-70ec-4bc4-bd2a-8bc427ced498","Type":"ContainerStarted","Data":"e348506e445d4fe05e54cb1f10d3f1d8709f3a8ef5bfdda0c434e3c8beea4c3b"} Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.924653 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.928824 4831 generic.go:334] "Generic (PLEG): container finished" podID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerID="2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0" exitCode=0 Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.928864 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerID="ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f" exitCode=2 Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.928892 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497edb30-816e-45e2-9180-cfc1392f2c1c","Type":"ContainerDied","Data":"2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0"} Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.928922 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497edb30-816e-45e2-9180-cfc1392f2c1c","Type":"ContainerDied","Data":"ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f"} Dec 04 10:36:49 crc kubenswrapper[4831]: I1204 10:36:49.941297 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.571370634 podStartE2EDuration="1.941271936s" podCreationTimestamp="2025-12-04 10:36:48 +0000 UTC" firstStartedPulling="2025-12-04 10:36:49.11023195 +0000 UTC m=+1306.059407254" lastFinishedPulling="2025-12-04 10:36:49.480133242 +0000 UTC m=+1306.429308556" observedRunningTime="2025-12-04 10:36:49.93914065 +0000 UTC m=+1306.888315984" watchObservedRunningTime="2025-12-04 10:36:49.941271936 +0000 UTC m=+1306.890447250" Dec 04 10:36:50 crc kubenswrapper[4831]: I1204 10:36:50.219400 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 10:36:50 crc kubenswrapper[4831]: I1204 10:36:50.268742 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 10:36:50 crc kubenswrapper[4831]: I1204 10:36:50.943533 4831 generic.go:334] "Generic (PLEG): container finished" podID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerID="1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd" exitCode=0 Dec 04 10:36:50 crc kubenswrapper[4831]: I1204 
10:36:50.944730 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497edb30-816e-45e2-9180-cfc1392f2c1c","Type":"ContainerDied","Data":"1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd"} Dec 04 10:36:50 crc kubenswrapper[4831]: I1204 10:36:50.975206 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.690069 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.746726 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-sg-core-conf-yaml\") pod \"497edb30-816e-45e2-9180-cfc1392f2c1c\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.746806 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d7lq\" (UniqueName: \"kubernetes.io/projected/497edb30-816e-45e2-9180-cfc1392f2c1c-kube-api-access-7d7lq\") pod \"497edb30-816e-45e2-9180-cfc1392f2c1c\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.746968 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-combined-ca-bundle\") pod \"497edb30-816e-45e2-9180-cfc1392f2c1c\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.747050 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-scripts\") pod \"497edb30-816e-45e2-9180-cfc1392f2c1c\" (UID: 
\"497edb30-816e-45e2-9180-cfc1392f2c1c\") " Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.747082 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-run-httpd\") pod \"497edb30-816e-45e2-9180-cfc1392f2c1c\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.747111 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-log-httpd\") pod \"497edb30-816e-45e2-9180-cfc1392f2c1c\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.747137 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-config-data\") pod \"497edb30-816e-45e2-9180-cfc1392f2c1c\" (UID: \"497edb30-816e-45e2-9180-cfc1392f2c1c\") " Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.747397 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "497edb30-816e-45e2-9180-cfc1392f2c1c" (UID: "497edb30-816e-45e2-9180-cfc1392f2c1c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.747503 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "497edb30-816e-45e2-9180-cfc1392f2c1c" (UID: "497edb30-816e-45e2-9180-cfc1392f2c1c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.748111 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.748134 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497edb30-816e-45e2-9180-cfc1392f2c1c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.768540 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-scripts" (OuterVolumeSpecName: "scripts") pod "497edb30-816e-45e2-9180-cfc1392f2c1c" (UID: "497edb30-816e-45e2-9180-cfc1392f2c1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.768575 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/497edb30-816e-45e2-9180-cfc1392f2c1c-kube-api-access-7d7lq" (OuterVolumeSpecName: "kube-api-access-7d7lq") pod "497edb30-816e-45e2-9180-cfc1392f2c1c" (UID: "497edb30-816e-45e2-9180-cfc1392f2c1c"). InnerVolumeSpecName "kube-api-access-7d7lq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.799258 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "497edb30-816e-45e2-9180-cfc1392f2c1c" (UID: "497edb30-816e-45e2-9180-cfc1392f2c1c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.850621 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.850677 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.850692 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d7lq\" (UniqueName: \"kubernetes.io/projected/497edb30-816e-45e2-9180-cfc1392f2c1c-kube-api-access-7d7lq\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.866073 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "497edb30-816e-45e2-9180-cfc1392f2c1c" (UID: "497edb30-816e-45e2-9180-cfc1392f2c1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.882235 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-config-data" (OuterVolumeSpecName: "config-data") pod "497edb30-816e-45e2-9180-cfc1392f2c1c" (UID: "497edb30-816e-45e2-9180-cfc1392f2c1c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.952539 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.952567 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497edb30-816e-45e2-9180-cfc1392f2c1c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.965464 4831 generic.go:334] "Generic (PLEG): container finished" podID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerID="e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667" exitCode=0 Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.965511 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497edb30-816e-45e2-9180-cfc1392f2c1c","Type":"ContainerDied","Data":"e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667"} Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.965543 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497edb30-816e-45e2-9180-cfc1392f2c1c","Type":"ContainerDied","Data":"177a4cc7b6b7877b0a90f7fc54fdd3802b74a893bf113981354dff7ca74c04e1"} Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.965563 4831 scope.go:117] "RemoveContainer" containerID="2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0" Dec 04 10:36:52 crc kubenswrapper[4831]: I1204 10:36:52.965885 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.002525 4831 scope.go:117] "RemoveContainer" containerID="ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.002747 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.012191 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.030509 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:53 crc kubenswrapper[4831]: E1204 10:36:53.031172 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="ceilometer-notification-agent" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.031242 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="ceilometer-notification-agent" Dec 04 10:36:53 crc kubenswrapper[4831]: E1204 10:36:53.031312 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="ceilometer-central-agent" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.031394 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="ceilometer-central-agent" Dec 04 10:36:53 crc kubenswrapper[4831]: E1204 10:36:53.031464 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="sg-core" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.031525 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="sg-core" Dec 04 10:36:53 crc kubenswrapper[4831]: E1204 10:36:53.031599 4831 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="proxy-httpd" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.031705 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="proxy-httpd" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.032008 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="sg-core" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.032109 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="ceilometer-notification-agent" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.032250 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="ceilometer-central-agent" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.032444 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" containerName="proxy-httpd" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.032403 4831 scope.go:117] "RemoveContainer" containerID="e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.035330 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.037627 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.038065 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.038298 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.049674 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.060628 4831 scope.go:117] "RemoveContainer" containerID="1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.081970 4831 scope.go:117] "RemoveContainer" containerID="2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0" Dec 04 10:36:53 crc kubenswrapper[4831]: E1204 10:36:53.082513 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0\": container with ID starting with 2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0 not found: ID does not exist" containerID="2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.082579 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0"} err="failed to get container status \"2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0\": rpc error: code = NotFound desc = could not find container \"2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0\": 
container with ID starting with 2ad59666456662eed54cc5463119b7fe38d168438dc698d4536fef71174693b0 not found: ID does not exist" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.082618 4831 scope.go:117] "RemoveContainer" containerID="ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f" Dec 04 10:36:53 crc kubenswrapper[4831]: E1204 10:36:53.083033 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f\": container with ID starting with ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f not found: ID does not exist" containerID="ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.083077 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f"} err="failed to get container status \"ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f\": rpc error: code = NotFound desc = could not find container \"ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f\": container with ID starting with ef95e8684c317ba39ddc1e8dd5a156ab0fe2edf5a5042395fb5dfb2e48b36d4f not found: ID does not exist" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.083104 4831 scope.go:117] "RemoveContainer" containerID="e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667" Dec 04 10:36:53 crc kubenswrapper[4831]: E1204 10:36:53.083317 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667\": container with ID starting with e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667 not found: ID does not exist" 
containerID="e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.083340 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667"} err="failed to get container status \"e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667\": rpc error: code = NotFound desc = could not find container \"e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667\": container with ID starting with e275b793e55ff61a6841bba6df6a8edfacfa68c57633c0dcf3a30e8019927667 not found: ID does not exist" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.083357 4831 scope.go:117] "RemoveContainer" containerID="1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd" Dec 04 10:36:53 crc kubenswrapper[4831]: E1204 10:36:53.083512 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd\": container with ID starting with 1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd not found: ID does not exist" containerID="1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.083535 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd"} err="failed to get container status \"1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd\": rpc error: code = NotFound desc = could not find container \"1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd\": container with ID starting with 1cdb2f89346659e1489a3a26b8fd8d830f5d69984d0c880c48ec59a2523b29cd not found: ID does not exist" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.160908 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.161031 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-run-httpd\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.161092 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb4t8\" (UniqueName: \"kubernetes.io/projected/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-kube-api-access-mb4t8\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.161122 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-log-httpd\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.161203 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-config-data\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.161286 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.161331 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-scripts\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.161358 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.263784 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-config-data\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.263920 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.263970 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-scripts\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 
crc kubenswrapper[4831]: I1204 10:36:53.264005 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.264070 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.264495 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-run-httpd\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.264681 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb4t8\" (UniqueName: \"kubernetes.io/projected/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-kube-api-access-mb4t8\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.264741 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-log-httpd\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.264905 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-run-httpd\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.265104 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-log-httpd\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.267265 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-scripts\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.267889 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.268071 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-config-data\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.268223 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.268595 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.283127 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb4t8\" (UniqueName: \"kubernetes.io/projected/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-kube-api-access-mb4t8\") pod \"ceilometer-0\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.287726 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="497edb30-816e-45e2-9180-cfc1392f2c1c" path="/var/lib/kubelet/pods/497edb30-816e-45e2-9180-cfc1392f2c1c/volumes" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.363139 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.845383 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:36:53 crc kubenswrapper[4831]: W1204 10:36:53.846831 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2686e5ba_98c9_4707_b45d_1dadeeeaeaa9.slice/crio-f31268a31ad6252e04299b2281ad9e3941f1ee0265e565c9d9e7910530859e77 WatchSource:0}: Error finding container f31268a31ad6252e04299b2281ad9e3941f1ee0265e565c9d9e7910530859e77: Status 404 returned error can't find the container with id f31268a31ad6252e04299b2281ad9e3941f1ee0265e565c9d9e7910530859e77 Dec 04 10:36:53 crc kubenswrapper[4831]: I1204 10:36:53.975603 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9","Type":"ContainerStarted","Data":"f31268a31ad6252e04299b2281ad9e3941f1ee0265e565c9d9e7910530859e77"} Dec 04 10:36:54 crc kubenswrapper[4831]: I1204 10:36:54.987345 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9","Type":"ContainerStarted","Data":"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7"} Dec 04 10:36:54 crc kubenswrapper[4831]: I1204 10:36:54.987721 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9","Type":"ContainerStarted","Data":"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3"} Dec 04 10:36:54 crc kubenswrapper[4831]: I1204 10:36:54.987734 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9","Type":"ContainerStarted","Data":"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e"} Dec 04 10:36:57 crc kubenswrapper[4831]: I1204 10:36:57.011449 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9","Type":"ContainerStarted","Data":"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c"} Dec 04 10:36:57 crc kubenswrapper[4831]: I1204 10:36:57.013950 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:36:57 crc kubenswrapper[4831]: I1204 10:36:57.034244 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8075364980000002 podStartE2EDuration="4.034225571s" podCreationTimestamp="2025-12-04 10:36:53 +0000 UTC" firstStartedPulling="2025-12-04 10:36:53.849600492 +0000 UTC m=+1310.798775806" lastFinishedPulling="2025-12-04 10:36:56.076289565 +0000 UTC m=+1313.025464879" observedRunningTime="2025-12-04 
10:36:57.031742955 +0000 UTC m=+1313.980918279" watchObservedRunningTime="2025-12-04 10:36:57.034225571 +0000 UTC m=+1313.983400895" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.027057 4831 generic.go:334] "Generic (PLEG): container finished" podID="948ac29a-d7ab-47a3-aa8e-d74b5f1de702" containerID="a38aa139053bbe0f386d00ab71e9750f2cb13948cd9ede8fb0f68c9f3cc5bbaf" exitCode=137 Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.027152 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"948ac29a-d7ab-47a3-aa8e-d74b5f1de702","Type":"ContainerDied","Data":"a38aa139053bbe0f386d00ab71e9750f2cb13948cd9ede8fb0f68c9f3cc5bbaf"} Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.030160 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" containerID="c7f81ae4c7b0f64b9d790182ddaf981790635588a8398d42efbcd77d76762dbe" exitCode=137 Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.030238 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2","Type":"ContainerDied","Data":"c7f81ae4c7b0f64b9d790182ddaf981790635588a8398d42efbcd77d76762dbe"} Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.164936 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.165682 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.166736 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.166845 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.173491 4831 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.173587 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.390752 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86dddb665-dtvkq"] Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.392462 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.423728 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86dddb665-dtvkq"] Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.479867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-swift-storage-0\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.479928 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-sb\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.479970 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-nb\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc 
kubenswrapper[4831]: I1204 10:36:58.480024 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-config\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.480165 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-svc\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.480262 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp2vf\" (UniqueName: \"kubernetes.io/projected/88fcf3fe-10ba-460e-89bb-6b94936a183e-kube-api-access-kp2vf\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.583810 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-svc\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.583904 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp2vf\" (UniqueName: \"kubernetes.io/projected/88fcf3fe-10ba-460e-89bb-6b94936a183e-kube-api-access-kp2vf\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: 
I1204 10:36:58.583942 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-swift-storage-0\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.583964 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-sb\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.583994 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-nb\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.584025 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-config\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.584866 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-config\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.585355 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-svc\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.586105 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-swift-storage-0\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.586563 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-sb\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.587043 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-nb\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.618705 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp2vf\" (UniqueName: \"kubernetes.io/projected/88fcf3fe-10ba-460e-89bb-6b94936a183e-kube-api-access-kp2vf\") pod \"dnsmasq-dns-86dddb665-dtvkq\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 10:36:58.637887 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 10:36:58 crc kubenswrapper[4831]: I1204 
10:36:58.733782 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.039871 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.042952 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.097305 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sttrf\" (UniqueName: \"kubernetes.io/projected/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-kube-api-access-sttrf\") pod \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.097603 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4cvc\" (UniqueName: \"kubernetes.io/projected/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-kube-api-access-v4cvc\") pod \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.097742 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-logs\") pod \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.097813 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-combined-ca-bundle\") pod \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.097872 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-config-data\") pod \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\" (UID: \"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2\") " Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.097930 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-config-data\") pod \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.097959 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-combined-ca-bundle\") pod \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\" (UID: \"948ac29a-d7ab-47a3-aa8e-d74b5f1de702\") " Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.100028 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-logs" (OuterVolumeSpecName: "logs") pod "7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" (UID: "7f988f09-6ce5-4cf5-a32b-18fa5a0100d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.128346 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-kube-api-access-sttrf" (OuterVolumeSpecName: "kube-api-access-sttrf") pod "948ac29a-d7ab-47a3-aa8e-d74b5f1de702" (UID: "948ac29a-d7ab-47a3-aa8e-d74b5f1de702"). InnerVolumeSpecName "kube-api-access-sttrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.132883 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f988f09-6ce5-4cf5-a32b-18fa5a0100d2","Type":"ContainerDied","Data":"106029a4a77f75f6ff9a424a17f590bfaccf6bb964f3a22272606d358f043014"} Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.132931 4831 scope.go:117] "RemoveContainer" containerID="c7f81ae4c7b0f64b9d790182ddaf981790635588a8398d42efbcd77d76762dbe" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.133065 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.161126 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-kube-api-access-v4cvc" (OuterVolumeSpecName: "kube-api-access-v4cvc") pod "7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" (UID: "7f988f09-6ce5-4cf5-a32b-18fa5a0100d2"). InnerVolumeSpecName "kube-api-access-v4cvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.200143 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sttrf\" (UniqueName: \"kubernetes.io/projected/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-kube-api-access-sttrf\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.200187 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4cvc\" (UniqueName: \"kubernetes.io/projected/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-kube-api-access-v4cvc\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.200200 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.204702 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.204907 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"948ac29a-d7ab-47a3-aa8e-d74b5f1de702","Type":"ContainerDied","Data":"23076ac4e9d1a4540e9d46caad15677918147b1de0e02a71c93c4da3f456a427"} Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.239072 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-config-data" (OuterVolumeSpecName: "config-data") pod "7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" (UID: "7f988f09-6ce5-4cf5-a32b-18fa5a0100d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.252846 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-config-data" (OuterVolumeSpecName: "config-data") pod "948ac29a-d7ab-47a3-aa8e-d74b5f1de702" (UID: "948ac29a-d7ab-47a3-aa8e-d74b5f1de702"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.258483 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" (UID: "7f988f09-6ce5-4cf5-a32b-18fa5a0100d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.261075 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "948ac29a-d7ab-47a3-aa8e-d74b5f1de702" (UID: "948ac29a-d7ab-47a3-aa8e-d74b5f1de702"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.276222 4831 scope.go:117] "RemoveContainer" containerID="5e493f5ba505cebd04effa30fd5e10a7212fae75c6c1f0cc6301d4c991727ce8" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.301988 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.302032 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.302041 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.302049 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948ac29a-d7ab-47a3-aa8e-d74b5f1de702-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.337650 4831 scope.go:117] "RemoveContainer" containerID="a38aa139053bbe0f386d00ab71e9750f2cb13948cd9ede8fb0f68c9f3cc5bbaf" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.446528 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86dddb665-dtvkq"] Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.467546 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.481736 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 
10:36:59.488214 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:36:59 crc kubenswrapper[4831]: E1204 10:36:59.488582 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" containerName="nova-metadata-log" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.488594 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" containerName="nova-metadata-log" Dec 04 10:36:59 crc kubenswrapper[4831]: E1204 10:36:59.488622 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" containerName="nova-metadata-metadata" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.488627 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" containerName="nova-metadata-metadata" Dec 04 10:36:59 crc kubenswrapper[4831]: E1204 10:36:59.488648 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948ac29a-d7ab-47a3-aa8e-d74b5f1de702" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.488655 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="948ac29a-d7ab-47a3-aa8e-d74b5f1de702" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.488832 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="948ac29a-d7ab-47a3-aa8e-d74b5f1de702" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.488858 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" containerName="nova-metadata-log" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.488867 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" containerName="nova-metadata-metadata" Dec 04 10:36:59 crc 
kubenswrapper[4831]: I1204 10:36:59.489822 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.491559 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.495015 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.524743 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.561067 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.570214 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.581249 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.582585 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.586613 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.587551 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.587774 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.611112 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2gnt\" (UniqueName: \"kubernetes.io/projected/085dfcdf-cc1f-4769-b7bd-181169b959c4-kube-api-access-h2gnt\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.611123 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.611173 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.611221 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-config-data\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.611239 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/085dfcdf-cc1f-4769-b7bd-181169b959c4-logs\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.611321 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.712721 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.712807 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.712908 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2gnt\" (UniqueName: \"kubernetes.io/projected/085dfcdf-cc1f-4769-b7bd-181169b959c4-kube-api-access-h2gnt\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.712940 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.712976 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.713004 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-config-data\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.713027 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/085dfcdf-cc1f-4769-b7bd-181169b959c4-logs\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.713083 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.713132 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbv7v\" (UniqueName: 
\"kubernetes.io/projected/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-kube-api-access-pbv7v\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.713172 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.715107 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/085dfcdf-cc1f-4769-b7bd-181169b959c4-logs\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.720891 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.731646 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.732454 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-config-data\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" 
Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.734021 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2gnt\" (UniqueName: \"kubernetes.io/projected/085dfcdf-cc1f-4769-b7bd-181169b959c4-kube-api-access-h2gnt\") pod \"nova-metadata-0\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.815034 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.815089 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.815218 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.815281 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.815333 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pbv7v\" (UniqueName: \"kubernetes.io/projected/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-kube-api-access-pbv7v\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.819210 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.820681 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.821701 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.823791 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.836379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbv7v\" (UniqueName: 
\"kubernetes.io/projected/5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b-kube-api-access-pbv7v\") pod \"nova-cell1-novncproxy-0\" (UID: \"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.839529 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:36:59 crc kubenswrapper[4831]: I1204 10:36:59.903566 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:37:00 crc kubenswrapper[4831]: I1204 10:37:00.234762 4831 generic.go:334] "Generic (PLEG): container finished" podID="88fcf3fe-10ba-460e-89bb-6b94936a183e" containerID="fd670f42a2981b0642eaa0634d258f6e32f944a33bd44d4f643decb26080e4e5" exitCode=0 Dec 04 10:37:00 crc kubenswrapper[4831]: I1204 10:37:00.236295 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" event={"ID":"88fcf3fe-10ba-460e-89bb-6b94936a183e","Type":"ContainerDied","Data":"fd670f42a2981b0642eaa0634d258f6e32f944a33bd44d4f643decb26080e4e5"} Dec 04 10:37:00 crc kubenswrapper[4831]: I1204 10:37:00.236325 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" event={"ID":"88fcf3fe-10ba-460e-89bb-6b94936a183e","Type":"ContainerStarted","Data":"9330e3ef5f38b26cb332c32d800d7e1a3b40d798412ee6e0342feaab9c88ad4f"} Dec 04 10:37:00 crc kubenswrapper[4831]: I1204 10:37:00.423980 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:37:00 crc kubenswrapper[4831]: I1204 10:37:00.562090 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:37:00 crc kubenswrapper[4831]: W1204 10:37:00.563132 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cc0cf0e_1e1e_4c0b_8b47_fafcdf58b57b.slice/crio-97602687ea8baa3da7fa8c0baf7fe42f9de62a8fcda9538d98908c1a4e752b53 WatchSource:0}: Error finding container 97602687ea8baa3da7fa8c0baf7fe42f9de62a8fcda9538d98908c1a4e752b53: Status 404 returned error can't find the container with id 97602687ea8baa3da7fa8c0baf7fe42f9de62a8fcda9538d98908c1a4e752b53 Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.174586 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.256639 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b","Type":"ContainerStarted","Data":"83cde8701c54bc83897b6afe44cda4ceca2a5e8b692a702e4f973e4a13fbcb05"} Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.257018 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b","Type":"ContainerStarted","Data":"97602687ea8baa3da7fa8c0baf7fe42f9de62a8fcda9538d98908c1a4e752b53"} Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.259903 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" event={"ID":"88fcf3fe-10ba-460e-89bb-6b94936a183e","Type":"ContainerStarted","Data":"b271798fadb7ecde85c8ec4c83e30da4c06274d7287f6522556443c07aa234e6"} Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.260019 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.261979 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerName="nova-api-log" containerID="cri-o://b5ae963d18514b6df38f3d1569269e84b02e2651bd743da7404dc5a914b52ada" 
gracePeriod=30 Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.262587 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"085dfcdf-cc1f-4769-b7bd-181169b959c4","Type":"ContainerStarted","Data":"acc19b0820a5d0abbc4f1e31955ec10a41ad6c86d5df871544bbc29a58be13ab"} Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.262617 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"085dfcdf-cc1f-4769-b7bd-181169b959c4","Type":"ContainerStarted","Data":"22b6b312a91c400aedc2ded68076b60054520725a89ab67a126094001bb86f9e"} Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.262628 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"085dfcdf-cc1f-4769-b7bd-181169b959c4","Type":"ContainerStarted","Data":"b3988714f961583fec2d936ce8b0069914bc8c3290679797c5da4ddb65761721"} Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.262706 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerName="nova-api-api" containerID="cri-o://184de0549ae1f93177ed87ee9769b3c77e64bc7f2f13269f4c4893eac88c81c7" gracePeriod=30 Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.300741 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.300718367 podStartE2EDuration="2.300718367s" podCreationTimestamp="2025-12-04 10:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:37:01.282790211 +0000 UTC m=+1318.231965535" watchObservedRunningTime="2025-12-04 10:37:01.300718367 +0000 UTC m=+1318.249893681" Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.316409 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7f988f09-6ce5-4cf5-a32b-18fa5a0100d2" path="/var/lib/kubelet/pods/7f988f09-6ce5-4cf5-a32b-18fa5a0100d2/volumes" Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.317140 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="948ac29a-d7ab-47a3-aa8e-d74b5f1de702" path="/var/lib/kubelet/pods/948ac29a-d7ab-47a3-aa8e-d74b5f1de702/volumes" Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.335928 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.335907001 podStartE2EDuration="2.335907001s" podCreationTimestamp="2025-12-04 10:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:37:01.319112065 +0000 UTC m=+1318.268287379" watchObservedRunningTime="2025-12-04 10:37:01.335907001 +0000 UTC m=+1318.285082315" Dec 04 10:37:01 crc kubenswrapper[4831]: I1204 10:37:01.360398 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" podStartSLOduration=3.3603781010000002 podStartE2EDuration="3.360378101s" podCreationTimestamp="2025-12-04 10:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:37:01.343077521 +0000 UTC m=+1318.292252835" watchObservedRunningTime="2025-12-04 10:37:01.360378101 +0000 UTC m=+1318.309553415" Dec 04 10:37:02 crc kubenswrapper[4831]: I1204 10:37:02.108438 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:37:02 crc kubenswrapper[4831]: I1204 10:37:02.108740 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="ceilometer-central-agent" containerID="cri-o://67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e" 
gracePeriod=30 Dec 04 10:37:02 crc kubenswrapper[4831]: I1204 10:37:02.108818 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="sg-core" containerID="cri-o://12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7" gracePeriod=30 Dec 04 10:37:02 crc kubenswrapper[4831]: I1204 10:37:02.108867 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="proxy-httpd" containerID="cri-o://9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c" gracePeriod=30 Dec 04 10:37:02 crc kubenswrapper[4831]: I1204 10:37:02.108920 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="ceilometer-notification-agent" containerID="cri-o://52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3" gracePeriod=30 Dec 04 10:37:02 crc kubenswrapper[4831]: I1204 10:37:02.274460 4831 generic.go:334] "Generic (PLEG): container finished" podID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerID="12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7" exitCode=2 Dec 04 10:37:02 crc kubenswrapper[4831]: I1204 10:37:02.274539 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9","Type":"ContainerDied","Data":"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7"} Dec 04 10:37:02 crc kubenswrapper[4831]: I1204 10:37:02.276454 4831 generic.go:334] "Generic (PLEG): container finished" podID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerID="b5ae963d18514b6df38f3d1569269e84b02e2651bd743da7404dc5a914b52ada" exitCode=143 Dec 04 10:37:02 crc kubenswrapper[4831]: I1204 10:37:02.276573 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"6973b9dd-5fe0-4b1d-9df7-6de89e670246","Type":"ContainerDied","Data":"b5ae963d18514b6df38f3d1569269e84b02e2651bd743da7404dc5a914b52ada"} Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.344181 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.346907 4831 generic.go:334] "Generic (PLEG): container finished" podID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerID="184de0549ae1f93177ed87ee9769b3c77e64bc7f2f13269f4c4893eac88c81c7" exitCode=0 Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.347007 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6973b9dd-5fe0-4b1d-9df7-6de89e670246","Type":"ContainerDied","Data":"184de0549ae1f93177ed87ee9769b3c77e64bc7f2f13269f4c4893eac88c81c7"} Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.355975 4831 generic.go:334] "Generic (PLEG): container finished" podID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerID="9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c" exitCode=0 Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.356010 4831 generic.go:334] "Generic (PLEG): container finished" podID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerID="52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3" exitCode=0 Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.356019 4831 generic.go:334] "Generic (PLEG): container finished" podID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerID="67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e" exitCode=0 Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.356041 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9","Type":"ContainerDied","Data":"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c"} Dec 04 10:37:03 crc 
kubenswrapper[4831]: I1204 10:37:03.356072 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9","Type":"ContainerDied","Data":"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3"} Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.356085 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9","Type":"ContainerDied","Data":"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e"} Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.356095 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9","Type":"ContainerDied","Data":"f31268a31ad6252e04299b2281ad9e3941f1ee0265e565c9d9e7910530859e77"} Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.356113 4831 scope.go:117] "RemoveContainer" containerID="9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.356494 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.422648 4831 scope.go:117] "RemoveContainer" containerID="12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.459149 4831 scope.go:117] "RemoveContainer" containerID="52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.501594 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-run-httpd\") pod \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.501689 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-ceilometer-tls-certs\") pod \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.501800 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb4t8\" (UniqueName: \"kubernetes.io/projected/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-kube-api-access-mb4t8\") pod \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.501838 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-sg-core-conf-yaml\") pod \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.501873 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-log-httpd\") pod \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.501898 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-scripts\") pod \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.501947 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-combined-ca-bundle\") pod \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.502023 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-config-data\") pod \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\" (UID: \"2686e5ba-98c9-4707-b45d-1dadeeeaeaa9\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.502420 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" (UID: "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.502545 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" (UID: "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.508034 4831 scope.go:117] "RemoveContainer" containerID="67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.514916 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-scripts" (OuterVolumeSpecName: "scripts") pod "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" (UID: "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.515043 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-kube-api-access-mb4t8" (OuterVolumeSpecName: "kube-api-access-mb4t8") pod "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" (UID: "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9"). InnerVolumeSpecName "kube-api-access-mb4t8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.559354 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" (UID: "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.577432 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" (UID: "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.604936 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.604977 4831 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.604993 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb4t8\" (UniqueName: \"kubernetes.io/projected/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-kube-api-access-mb4t8\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.605007 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.605021 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.605031 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.670427 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" (UID: "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.690388 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-config-data" (OuterVolumeSpecName: "config-data") pod "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" (UID: "2686e5ba-98c9-4707-b45d-1dadeeeaeaa9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.706772 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.706807 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.746103 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.750499 4831 scope.go:117] "RemoveContainer" containerID="9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c" Dec 04 10:37:03 crc kubenswrapper[4831]: E1204 10:37:03.750989 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c\": container with ID starting with 9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c not found: ID does not exist" containerID="9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.751031 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c"} err="failed to get container status \"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c\": rpc error: code = NotFound desc = could not find container \"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c\": container with ID starting with 9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.751059 4831 scope.go:117] "RemoveContainer" containerID="12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7" Dec 04 10:37:03 crc kubenswrapper[4831]: E1204 10:37:03.751371 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7\": container with ID starting with 12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7 not found: ID does not exist" containerID="12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.751407 
4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7"} err="failed to get container status \"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7\": rpc error: code = NotFound desc = could not find container \"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7\": container with ID starting with 12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7 not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.751428 4831 scope.go:117] "RemoveContainer" containerID="52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3" Dec 04 10:37:03 crc kubenswrapper[4831]: E1204 10:37:03.751815 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3\": container with ID starting with 52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3 not found: ID does not exist" containerID="52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.751844 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3"} err="failed to get container status \"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3\": rpc error: code = NotFound desc = could not find container \"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3\": container with ID starting with 52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3 not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.751863 4831 scope.go:117] "RemoveContainer" containerID="67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e" Dec 04 10:37:03 crc kubenswrapper[4831]: E1204 
10:37:03.752115 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e\": container with ID starting with 67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e not found: ID does not exist" containerID="67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.752144 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e"} err="failed to get container status \"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e\": rpc error: code = NotFound desc = could not find container \"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e\": container with ID starting with 67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.752162 4831 scope.go:117] "RemoveContainer" containerID="9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.752405 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c"} err="failed to get container status \"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c\": rpc error: code = NotFound desc = could not find container \"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c\": container with ID starting with 9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.752434 4831 scope.go:117] "RemoveContainer" containerID="12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7" Dec 04 10:37:03 crc 
kubenswrapper[4831]: I1204 10:37:03.752703 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7"} err="failed to get container status \"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7\": rpc error: code = NotFound desc = could not find container \"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7\": container with ID starting with 12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7 not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.752730 4831 scope.go:117] "RemoveContainer" containerID="52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.752987 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3"} err="failed to get container status \"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3\": rpc error: code = NotFound desc = could not find container \"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3\": container with ID starting with 52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3 not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.753017 4831 scope.go:117] "RemoveContainer" containerID="67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.753273 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e"} err="failed to get container status \"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e\": rpc error: code = NotFound desc = could not find container \"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e\": container 
with ID starting with 67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.753301 4831 scope.go:117] "RemoveContainer" containerID="9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.753543 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c"} err="failed to get container status \"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c\": rpc error: code = NotFound desc = could not find container \"9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c\": container with ID starting with 9b70f0dea682eeb4a084b23cc8e1c7ab50c2338b4c509d344839ce10d2a40b2c not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.753564 4831 scope.go:117] "RemoveContainer" containerID="12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.753802 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7"} err="failed to get container status \"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7\": rpc error: code = NotFound desc = could not find container \"12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7\": container with ID starting with 12a41a7236d97d862156cc853c7af71e1ec83b4f218d58489f1415d2a0f1b6f7 not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.753822 4831 scope.go:117] "RemoveContainer" containerID="52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.754054 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3"} err="failed to get container status \"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3\": rpc error: code = NotFound desc = could not find container \"52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3\": container with ID starting with 52bb5b7261b5a6b3b9097633d04eba04533f95b7ac0da2050dab45f62608add3 not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.754084 4831 scope.go:117] "RemoveContainer" containerID="67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.754337 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e"} err="failed to get container status \"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e\": rpc error: code = NotFound desc = could not find container \"67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e\": container with ID starting with 67289110e07a4c1df95d095fdf75a21af41b63a9182362808dd213acaf2d3f7e not found: ID does not exist" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.909526 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-combined-ca-bundle\") pod \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.909673 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6973b9dd-5fe0-4b1d-9df7-6de89e670246-logs\") pod \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.909705 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-config-data\") pod \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.909781 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ttvg\" (UniqueName: \"kubernetes.io/projected/6973b9dd-5fe0-4b1d-9df7-6de89e670246-kube-api-access-4ttvg\") pod \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\" (UID: \"6973b9dd-5fe0-4b1d-9df7-6de89e670246\") " Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.910417 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6973b9dd-5fe0-4b1d-9df7-6de89e670246-logs" (OuterVolumeSpecName: "logs") pod "6973b9dd-5fe0-4b1d-9df7-6de89e670246" (UID: "6973b9dd-5fe0-4b1d-9df7-6de89e670246"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.913813 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6973b9dd-5fe0-4b1d-9df7-6de89e670246-kube-api-access-4ttvg" (OuterVolumeSpecName: "kube-api-access-4ttvg") pod "6973b9dd-5fe0-4b1d-9df7-6de89e670246" (UID: "6973b9dd-5fe0-4b1d-9df7-6de89e670246"). InnerVolumeSpecName "kube-api-access-4ttvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.950906 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-config-data" (OuterVolumeSpecName: "config-data") pod "6973b9dd-5fe0-4b1d-9df7-6de89e670246" (UID: "6973b9dd-5fe0-4b1d-9df7-6de89e670246"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.957222 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6973b9dd-5fe0-4b1d-9df7-6de89e670246" (UID: "6973b9dd-5fe0-4b1d-9df7-6de89e670246"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:03 crc kubenswrapper[4831]: I1204 10:37:03.995812 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.013174 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6973b9dd-5fe0-4b1d-9df7-6de89e670246-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.013236 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.013254 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ttvg\" (UniqueName: \"kubernetes.io/projected/6973b9dd-5fe0-4b1d-9df7-6de89e670246-kube-api-access-4ttvg\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.013270 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6973b9dd-5fe0-4b1d-9df7-6de89e670246-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.019579 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.061561 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Dec 04 10:37:04 crc kubenswrapper[4831]: E1204 10:37:04.062825 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="proxy-httpd" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.062850 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="proxy-httpd" Dec 04 10:37:04 crc kubenswrapper[4831]: E1204 10:37:04.062863 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="sg-core" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.062872 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="sg-core" Dec 04 10:37:04 crc kubenswrapper[4831]: E1204 10:37:04.062894 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerName="nova-api-log" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.062902 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerName="nova-api-log" Dec 04 10:37:04 crc kubenswrapper[4831]: E1204 10:37:04.062936 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="ceilometer-notification-agent" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.062945 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="ceilometer-notification-agent" Dec 04 10:37:04 crc kubenswrapper[4831]: E1204 10:37:04.062977 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="ceilometer-central-agent" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.062984 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" 
containerName="ceilometer-central-agent" Dec 04 10:37:04 crc kubenswrapper[4831]: E1204 10:37:04.062998 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerName="nova-api-api" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.063006 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerName="nova-api-api" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.063243 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="ceilometer-central-agent" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.063267 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="sg-core" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.063286 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="ceilometer-notification-agent" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.063302 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" containerName="proxy-httpd" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.063323 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerName="nova-api-api" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.063333 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" containerName="nova-api-log" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.066249 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.068471 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.069319 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.069815 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.074459 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.216062 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.216133 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-config-data\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.216196 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.216216 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zrlj9\" (UniqueName: \"kubernetes.io/projected/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-kube-api-access-zrlj9\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.216235 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-scripts\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.216275 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.216316 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-run-httpd\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.216357 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-log-httpd\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.318139 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-log-httpd\") pod \"ceilometer-0\" (UID: 
\"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.318192 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.318252 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-config-data\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.318326 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.318349 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrlj9\" (UniqueName: \"kubernetes.io/projected/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-kube-api-access-zrlj9\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.318372 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-scripts\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.318427 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.318481 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-run-httpd\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.318649 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-log-httpd\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.318797 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-run-httpd\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.323061 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.323234 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: 
I1204 10:37:04.323827 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-config-data\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.325109 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-scripts\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.325431 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.340142 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrlj9\" (UniqueName: \"kubernetes.io/projected/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-kube-api-access-zrlj9\") pod \"ceilometer-0\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.372131 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6973b9dd-5fe0-4b1d-9df7-6de89e670246","Type":"ContainerDied","Data":"4fd059f0723c19d09c8021830b9d7462d5ebfbaec462fc8e4fc45dd9e77884a4"} Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.372192 4831 scope.go:117] "RemoveContainer" containerID="184de0549ae1f93177ed87ee9769b3c77e64bc7f2f13269f4c4893eac88c81c7" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.372251 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.389566 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.407170 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.538966 4831 scope.go:117] "RemoveContainer" containerID="b5ae963d18514b6df38f3d1569269e84b02e2651bd743da7404dc5a914b52ada" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.543073 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.572876 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.588506 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.590465 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.593560 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.593937 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.595554 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.621886 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.727785 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3225b9e-74b2-4cc6-a556-75ececc9db23-logs\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.727859 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.727930 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-config-data\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.727958 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-public-tls-certs\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.728043 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcp6w\" (UniqueName: \"kubernetes.io/projected/f3225b9e-74b2-4cc6-a556-75ececc9db23-kube-api-access-rcp6w\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.728098 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.830574 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcp6w\" (UniqueName: \"kubernetes.io/projected/f3225b9e-74b2-4cc6-a556-75ececc9db23-kube-api-access-rcp6w\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.830700 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.830778 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3225b9e-74b2-4cc6-a556-75ececc9db23-logs\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" 
Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.830815 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.830886 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-config-data\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.830908 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-public-tls-certs\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.831565 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3225b9e-74b2-4cc6-a556-75ececc9db23-logs\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.835754 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.836741 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-config-data\") pod \"nova-api-0\" (UID: 
\"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.837217 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-public-tls-certs\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.840781 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.841855 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.844700 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.858239 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcp6w\" (UniqueName: \"kubernetes.io/projected/f3225b9e-74b2-4cc6-a556-75ececc9db23-kube-api-access-rcp6w\") pod \"nova-api-0\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.907767 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.910182 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:37:04 crc kubenswrapper[4831]: I1204 10:37:04.961245 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:37:05 crc kubenswrapper[4831]: I1204 10:37:05.294015 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2686e5ba-98c9-4707-b45d-1dadeeeaeaa9" path="/var/lib/kubelet/pods/2686e5ba-98c9-4707-b45d-1dadeeeaeaa9/volumes" Dec 04 10:37:05 crc kubenswrapper[4831]: I1204 10:37:05.295438 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6973b9dd-5fe0-4b1d-9df7-6de89e670246" path="/var/lib/kubelet/pods/6973b9dd-5fe0-4b1d-9df7-6de89e670246/volumes" Dec 04 10:37:05 crc kubenswrapper[4831]: I1204 10:37:05.387419 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7dc6587-6d8c-43f7-a7d2-23690ff571f9","Type":"ContainerStarted","Data":"368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4"} Dec 04 10:37:05 crc kubenswrapper[4831]: I1204 10:37:05.387475 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7dc6587-6d8c-43f7-a7d2-23690ff571f9","Type":"ContainerStarted","Data":"870d7fb50022d8f55b028624bc2a9edee7ffe430ebc89f5326751f9570dda3fa"} Dec 04 10:37:05 crc kubenswrapper[4831]: I1204 10:37:05.390520 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:06 crc kubenswrapper[4831]: I1204 10:37:06.398696 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7dc6587-6d8c-43f7-a7d2-23690ff571f9","Type":"ContainerStarted","Data":"a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef"} Dec 04 10:37:06 crc kubenswrapper[4831]: I1204 10:37:06.399232 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c7dc6587-6d8c-43f7-a7d2-23690ff571f9","Type":"ContainerStarted","Data":"7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3"} Dec 04 10:37:06 crc kubenswrapper[4831]: I1204 10:37:06.400864 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3225b9e-74b2-4cc6-a556-75ececc9db23","Type":"ContainerStarted","Data":"5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80"} Dec 04 10:37:06 crc kubenswrapper[4831]: I1204 10:37:06.400887 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3225b9e-74b2-4cc6-a556-75ececc9db23","Type":"ContainerStarted","Data":"2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e"} Dec 04 10:37:06 crc kubenswrapper[4831]: I1204 10:37:06.400898 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3225b9e-74b2-4cc6-a556-75ececc9db23","Type":"ContainerStarted","Data":"ecd900dc76366e08ffc106b3c63181313c598399f62a606dc75dce5e2ecd77f6"} Dec 04 10:37:06 crc kubenswrapper[4831]: I1204 10:37:06.419340 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.419321407 podStartE2EDuration="2.419321407s" podCreationTimestamp="2025-12-04 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:37:06.418429983 +0000 UTC m=+1323.367605317" watchObservedRunningTime="2025-12-04 10:37:06.419321407 +0000 UTC m=+1323.368496731" Dec 04 10:37:08 crc kubenswrapper[4831]: I1204 10:37:08.419042 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7dc6587-6d8c-43f7-a7d2-23690ff571f9","Type":"ContainerStarted","Data":"f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0"} Dec 04 10:37:08 crc kubenswrapper[4831]: I1204 10:37:08.419924 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:37:08 crc kubenswrapper[4831]: I1204 10:37:08.419193 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="proxy-httpd" containerID="cri-o://f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0" gracePeriod=30 Dec 04 10:37:08 crc kubenswrapper[4831]: I1204 10:37:08.419153 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="ceilometer-central-agent" containerID="cri-o://368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4" gracePeriod=30 Dec 04 10:37:08 crc kubenswrapper[4831]: I1204 10:37:08.419222 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="sg-core" containerID="cri-o://a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef" gracePeriod=30 Dec 04 10:37:08 crc kubenswrapper[4831]: I1204 10:37:08.419230 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="ceilometer-notification-agent" containerID="cri-o://7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3" gracePeriod=30 Dec 04 10:37:08 crc kubenswrapper[4831]: I1204 10:37:08.446245 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.099570019 podStartE2EDuration="5.446222947s" podCreationTimestamp="2025-12-04 10:37:03 +0000 UTC" firstStartedPulling="2025-12-04 10:37:04.971907836 +0000 UTC m=+1321.921083140" lastFinishedPulling="2025-12-04 10:37:07.318560754 +0000 UTC m=+1324.267736068" observedRunningTime="2025-12-04 10:37:08.445007684 +0000 UTC m=+1325.394182998" 
watchObservedRunningTime="2025-12-04 10:37:08.446222947 +0000 UTC m=+1325.395398271" Dec 04 10:37:08 crc kubenswrapper[4831]: I1204 10:37:08.735840 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:37:08 crc kubenswrapper[4831]: I1204 10:37:08.816302 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8df4bd59-49bbv"] Dec 04 10:37:08 crc kubenswrapper[4831]: I1204 10:37:08.816787 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" podUID="708027bc-d695-4f49-bb70-31173245a13f" containerName="dnsmasq-dns" containerID="cri-o://0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba" gracePeriod=10 Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.293617 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.421799 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-config\") pod \"708027bc-d695-4f49-bb70-31173245a13f\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.422805 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-sb\") pod \"708027bc-d695-4f49-bb70-31173245a13f\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.422940 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-svc\") pod \"708027bc-d695-4f49-bb70-31173245a13f\" (UID: 
\"708027bc-d695-4f49-bb70-31173245a13f\") " Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.423051 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-swift-storage-0\") pod \"708027bc-d695-4f49-bb70-31173245a13f\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.423239 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-nb\") pod \"708027bc-d695-4f49-bb70-31173245a13f\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.423467 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnrsx\" (UniqueName: \"kubernetes.io/projected/708027bc-d695-4f49-bb70-31173245a13f-kube-api-access-vnrsx\") pod \"708027bc-d695-4f49-bb70-31173245a13f\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.454866 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/708027bc-d695-4f49-bb70-31173245a13f-kube-api-access-vnrsx" (OuterVolumeSpecName: "kube-api-access-vnrsx") pod "708027bc-d695-4f49-bb70-31173245a13f" (UID: "708027bc-d695-4f49-bb70-31173245a13f"). InnerVolumeSpecName "kube-api-access-vnrsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.531688 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "708027bc-d695-4f49-bb70-31173245a13f" (UID: "708027bc-d695-4f49-bb70-31173245a13f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.532139 4831 generic.go:334] "Generic (PLEG): container finished" podID="708027bc-d695-4f49-bb70-31173245a13f" containerID="0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba" exitCode=0 Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.532571 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.532888 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" event={"ID":"708027bc-d695-4f49-bb70-31173245a13f","Type":"ContainerDied","Data":"0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba"} Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.532939 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8df4bd59-49bbv" event={"ID":"708027bc-d695-4f49-bb70-31173245a13f","Type":"ContainerDied","Data":"78f36bf099587aa292210e9615949f1c80cce61599853371f6d087e82a2c41d4"} Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.532959 4831 scope.go:117] "RemoveContainer" containerID="0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.538291 4831 generic.go:334] "Generic (PLEG): container finished" podID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerID="f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0" exitCode=0 Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.538403 4831 generic.go:334] "Generic (PLEG): container finished" podID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerID="a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef" exitCode=2 Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.538368 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c7dc6587-6d8c-43f7-a7d2-23690ff571f9","Type":"ContainerDied","Data":"f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0"} Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.538509 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7dc6587-6d8c-43f7-a7d2-23690ff571f9","Type":"ContainerDied","Data":"a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef"} Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.538526 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7dc6587-6d8c-43f7-a7d2-23690ff571f9","Type":"ContainerDied","Data":"7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3"} Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.538475 4831 generic.go:334] "Generic (PLEG): container finished" podID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerID="7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3" exitCode=0 Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.540169 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-nb\") pod \"708027bc-d695-4f49-bb70-31173245a13f\" (UID: \"708027bc-d695-4f49-bb70-31173245a13f\") " Dec 04 10:37:09 crc kubenswrapper[4831]: W1204 10:37:09.540313 4831 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/708027bc-d695-4f49-bb70-31173245a13f/volumes/kubernetes.io~configmap/ovsdbserver-nb Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.540333 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "708027bc-d695-4f49-bb70-31173245a13f" (UID: "708027bc-d695-4f49-bb70-31173245a13f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.540868 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnrsx\" (UniqueName: \"kubernetes.io/projected/708027bc-d695-4f49-bb70-31173245a13f-kube-api-access-vnrsx\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.540881 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.542741 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-config" (OuterVolumeSpecName: "config") pod "708027bc-d695-4f49-bb70-31173245a13f" (UID: "708027bc-d695-4f49-bb70-31173245a13f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.558953 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "708027bc-d695-4f49-bb70-31173245a13f" (UID: "708027bc-d695-4f49-bb70-31173245a13f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.562135 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "708027bc-d695-4f49-bb70-31173245a13f" (UID: "708027bc-d695-4f49-bb70-31173245a13f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.582172 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "708027bc-d695-4f49-bb70-31173245a13f" (UID: "708027bc-d695-4f49-bb70-31173245a13f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.646170 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.646201 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.646216 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.646227 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/708027bc-d695-4f49-bb70-31173245a13f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.651195 4831 scope.go:117] "RemoveContainer" containerID="f4c2ff65ae25feb78d8e09bda55e78fca42b761b0f747b220faae5f59c50d54f" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.678599 4831 scope.go:117] "RemoveContainer" containerID="0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba" Dec 04 10:37:09 crc kubenswrapper[4831]: E1204 10:37:09.680187 4831 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba\": container with ID starting with 0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba not found: ID does not exist" containerID="0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.680240 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba"} err="failed to get container status \"0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba\": rpc error: code = NotFound desc = could not find container \"0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba\": container with ID starting with 0eb8a5f2a0e0b126fe24e5f3096174fd5168eb32b10acfa9221b786103471aba not found: ID does not exist" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.680269 4831 scope.go:117] "RemoveContainer" containerID="f4c2ff65ae25feb78d8e09bda55e78fca42b761b0f747b220faae5f59c50d54f" Dec 04 10:37:09 crc kubenswrapper[4831]: E1204 10:37:09.680615 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c2ff65ae25feb78d8e09bda55e78fca42b761b0f747b220faae5f59c50d54f\": container with ID starting with f4c2ff65ae25feb78d8e09bda55e78fca42b761b0f747b220faae5f59c50d54f not found: ID does not exist" containerID="f4c2ff65ae25feb78d8e09bda55e78fca42b761b0f747b220faae5f59c50d54f" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.680648 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c2ff65ae25feb78d8e09bda55e78fca42b761b0f747b220faae5f59c50d54f"} err="failed to get container status \"f4c2ff65ae25feb78d8e09bda55e78fca42b761b0f747b220faae5f59c50d54f\": rpc error: code = NotFound desc = could 
not find container \"f4c2ff65ae25feb78d8e09bda55e78fca42b761b0f747b220faae5f59c50d54f\": container with ID starting with f4c2ff65ae25feb78d8e09bda55e78fca42b761b0f747b220faae5f59c50d54f not found: ID does not exist" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.840603 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.840669 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.868197 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8df4bd59-49bbv"] Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.897610 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8df4bd59-49bbv"] Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.904427 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:37:09 crc kubenswrapper[4831]: I1204 10:37:09.933109 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.571956 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.799191 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6wj7l"] Dec 04 10:37:10 crc kubenswrapper[4831]: E1204 10:37:10.799759 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708027bc-d695-4f49-bb70-31173245a13f" containerName="init" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.799785 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="708027bc-d695-4f49-bb70-31173245a13f" containerName="init" Dec 04 10:37:10 crc kubenswrapper[4831]: E1204 
10:37:10.799819 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708027bc-d695-4f49-bb70-31173245a13f" containerName="dnsmasq-dns" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.799830 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="708027bc-d695-4f49-bb70-31173245a13f" containerName="dnsmasq-dns" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.800105 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="708027bc-d695-4f49-bb70-31173245a13f" containerName="dnsmasq-dns" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.800933 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.803312 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.803901 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.813851 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6wj7l"] Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.852859 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.853218 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 
10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.972192 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.972523 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnk8r\" (UniqueName: \"kubernetes.io/projected/577e755d-3044-4c40-bdba-51fe1291b774-kube-api-access-gnk8r\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.972734 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-config-data\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:10 crc kubenswrapper[4831]: I1204 10:37:10.972874 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-scripts\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:11 crc kubenswrapper[4831]: I1204 10:37:11.075325 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " 
pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:11 crc kubenswrapper[4831]: I1204 10:37:11.075519 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnk8r\" (UniqueName: \"kubernetes.io/projected/577e755d-3044-4c40-bdba-51fe1291b774-kube-api-access-gnk8r\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:11 crc kubenswrapper[4831]: I1204 10:37:11.075577 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-config-data\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:11 crc kubenswrapper[4831]: I1204 10:37:11.075638 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-scripts\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:11 crc kubenswrapper[4831]: I1204 10:37:11.080734 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-scripts\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:11 crc kubenswrapper[4831]: I1204 10:37:11.081030 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-config-data\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:11 crc kubenswrapper[4831]: I1204 
10:37:11.081475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:11 crc kubenswrapper[4831]: I1204 10:37:11.094597 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnk8r\" (UniqueName: \"kubernetes.io/projected/577e755d-3044-4c40-bdba-51fe1291b774-kube-api-access-gnk8r\") pod \"nova-cell1-cell-mapping-6wj7l\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:11 crc kubenswrapper[4831]: I1204 10:37:11.123779 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:11 crc kubenswrapper[4831]: I1204 10:37:11.294506 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="708027bc-d695-4f49-bb70-31173245a13f" path="/var/lib/kubelet/pods/708027bc-d695-4f49-bb70-31173245a13f/volumes" Dec 04 10:37:11 crc kubenswrapper[4831]: I1204 10:37:11.693553 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6wj7l"] Dec 04 10:37:11 crc kubenswrapper[4831]: W1204 10:37:11.735919 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod577e755d_3044_4c40_bdba_51fe1291b774.slice/crio-7c4a16ef7462224c6c190ca3fd21f43aefc1cd0f62f1bcbbcc4830a8fe33f6af WatchSource:0}: Error finding container 7c4a16ef7462224c6c190ca3fd21f43aefc1cd0f62f1bcbbcc4830a8fe33f6af: Status 404 returned error can't find the container with id 7c4a16ef7462224c6c190ca3fd21f43aefc1cd0f62f1bcbbcc4830a8fe33f6af Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.012840 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.207762 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-log-httpd\") pod \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.207939 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-scripts\") pod \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.207970 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-ceilometer-tls-certs\") pod \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.208000 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrlj9\" (UniqueName: \"kubernetes.io/projected/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-kube-api-access-zrlj9\") pod \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.208035 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-config-data\") pod \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.208109 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-sg-core-conf-yaml\") pod \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.208178 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-combined-ca-bundle\") pod \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.208216 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-run-httpd\") pod \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\" (UID: \"c7dc6587-6d8c-43f7-a7d2-23690ff571f9\") " Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.208715 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c7dc6587-6d8c-43f7-a7d2-23690ff571f9" (UID: "c7dc6587-6d8c-43f7-a7d2-23690ff571f9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.208857 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c7dc6587-6d8c-43f7-a7d2-23690ff571f9" (UID: "c7dc6587-6d8c-43f7-a7d2-23690ff571f9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.212882 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-kube-api-access-zrlj9" (OuterVolumeSpecName: "kube-api-access-zrlj9") pod "c7dc6587-6d8c-43f7-a7d2-23690ff571f9" (UID: "c7dc6587-6d8c-43f7-a7d2-23690ff571f9"). InnerVolumeSpecName "kube-api-access-zrlj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.214858 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-scripts" (OuterVolumeSpecName: "scripts") pod "c7dc6587-6d8c-43f7-a7d2-23690ff571f9" (UID: "c7dc6587-6d8c-43f7-a7d2-23690ff571f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.246197 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c7dc6587-6d8c-43f7-a7d2-23690ff571f9" (UID: "c7dc6587-6d8c-43f7-a7d2-23690ff571f9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.278913 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c7dc6587-6d8c-43f7-a7d2-23690ff571f9" (UID: "c7dc6587-6d8c-43f7-a7d2-23690ff571f9"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.307285 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7dc6587-6d8c-43f7-a7d2-23690ff571f9" (UID: "c7dc6587-6d8c-43f7-a7d2-23690ff571f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.309994 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.310020 4831 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.310033 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrlj9\" (UniqueName: \"kubernetes.io/projected/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-kube-api-access-zrlj9\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.310044 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.310052 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.310060 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.310068 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.319605 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-config-data" (OuterVolumeSpecName: "config-data") pod "c7dc6587-6d8c-43f7-a7d2-23690ff571f9" (UID: "c7dc6587-6d8c-43f7-a7d2-23690ff571f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.412163 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7dc6587-6d8c-43f7-a7d2-23690ff571f9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.570920 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6wj7l" event={"ID":"577e755d-3044-4c40-bdba-51fe1291b774","Type":"ContainerStarted","Data":"b61459417786d9bc254d7a785d825caf7f95f76f29934f94f5ef29526dfebf73"} Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.570975 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6wj7l" event={"ID":"577e755d-3044-4c40-bdba-51fe1291b774","Type":"ContainerStarted","Data":"7c4a16ef7462224c6c190ca3fd21f43aefc1cd0f62f1bcbbcc4830a8fe33f6af"} Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.574201 4831 generic.go:334] "Generic (PLEG): container finished" podID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerID="368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4" exitCode=0 Dec 04 10:37:12 crc kubenswrapper[4831]: 
I1204 10:37:12.574246 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7dc6587-6d8c-43f7-a7d2-23690ff571f9","Type":"ContainerDied","Data":"368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4"} Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.574263 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.574282 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7dc6587-6d8c-43f7-a7d2-23690ff571f9","Type":"ContainerDied","Data":"870d7fb50022d8f55b028624bc2a9edee7ffe430ebc89f5326751f9570dda3fa"} Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.574306 4831 scope.go:117] "RemoveContainer" containerID="f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.609195 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6wj7l" podStartSLOduration=2.609176213 podStartE2EDuration="2.609176213s" podCreationTimestamp="2025-12-04 10:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:37:12.603872802 +0000 UTC m=+1329.553048116" watchObservedRunningTime="2025-12-04 10:37:12.609176213 +0000 UTC m=+1329.558351527" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.614154 4831 scope.go:117] "RemoveContainer" containerID="a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.661844 4831 scope.go:117] "RemoveContainer" containerID="7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.662019 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:37:12 
crc kubenswrapper[4831]: I1204 10:37:12.696721 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.740724 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:37:12 crc kubenswrapper[4831]: E1204 10:37:12.741211 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="ceilometer-central-agent" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.741223 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="ceilometer-central-agent" Dec 04 10:37:12 crc kubenswrapper[4831]: E1204 10:37:12.741248 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="ceilometer-notification-agent" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.741254 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="ceilometer-notification-agent" Dec 04 10:37:12 crc kubenswrapper[4831]: E1204 10:37:12.741270 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="proxy-httpd" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.741276 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="proxy-httpd" Dec 04 10:37:12 crc kubenswrapper[4831]: E1204 10:37:12.741284 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="sg-core" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.741290 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="sg-core" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.741460 4831 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="ceilometer-notification-agent" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.741476 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="ceilometer-central-agent" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.741499 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="proxy-httpd" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.741509 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" containerName="sg-core" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.743286 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.752329 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.752982 4831 scope.go:117] "RemoveContainer" containerID="368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.753105 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.753258 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.802503 4831 scope.go:117] "RemoveContainer" containerID="f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.802809 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:37:12 crc kubenswrapper[4831]: E1204 10:37:12.806501 4831 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0\": container with ID starting with f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0 not found: ID does not exist" containerID="f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.806583 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0"} err="failed to get container status \"f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0\": rpc error: code = NotFound desc = could not find container \"f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0\": container with ID starting with f4682efeb59be148392c7aaa917239f03447d8862f4cc2dfd4e63545de8effa0 not found: ID does not exist" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.806634 4831 scope.go:117] "RemoveContainer" containerID="a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef" Dec 04 10:37:12 crc kubenswrapper[4831]: E1204 10:37:12.808327 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef\": container with ID starting with a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef not found: ID does not exist" containerID="a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.808391 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef"} err="failed to get container status \"a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef\": rpc error: code = NotFound desc = could not find container 
\"a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef\": container with ID starting with a5e61a3d44fe52293c2d9408ac3f0c6ebead1bd974b479b1d844cf13bc68c4ef not found: ID does not exist" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.808415 4831 scope.go:117] "RemoveContainer" containerID="7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3" Dec 04 10:37:12 crc kubenswrapper[4831]: E1204 10:37:12.810348 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3\": container with ID starting with 7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3 not found: ID does not exist" containerID="7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.810387 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3"} err="failed to get container status \"7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3\": rpc error: code = NotFound desc = could not find container \"7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3\": container with ID starting with 7ee267d8a5b029b774e3eb52c6394c0275ead877110f5c918833668c29f8bbb3 not found: ID does not exist" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.810414 4831 scope.go:117] "RemoveContainer" containerID="368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4" Dec 04 10:37:12 crc kubenswrapper[4831]: E1204 10:37:12.813433 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4\": container with ID starting with 368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4 not found: ID does not exist" 
containerID="368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.813483 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4"} err="failed to get container status \"368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4\": rpc error: code = NotFound desc = could not find container \"368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4\": container with ID starting with 368c9e18ec8bc9af31ae086423a6b6fa2395306257817c6b2896befc3a5e61b4 not found: ID does not exist" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.925494 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.925545 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/840623b4-007c-441a-9c28-53ebf2e02b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.925583 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpn2m\" (UniqueName: \"kubernetes.io/projected/840623b4-007c-441a-9c28-53ebf2e02b5c-kube-api-access-hpn2m\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.925621 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.925675 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/840623b4-007c-441a-9c28-53ebf2e02b5c-log-httpd\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.925736 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-scripts\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.925791 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:12 crc kubenswrapper[4831]: I1204 10:37:12.925815 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-config-data\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.027611 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.027701 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-config-data\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.027765 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.027794 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/840623b4-007c-441a-9c28-53ebf2e02b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.027836 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpn2m\" (UniqueName: \"kubernetes.io/projected/840623b4-007c-441a-9c28-53ebf2e02b5c-kube-api-access-hpn2m\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.027882 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.027938 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/840623b4-007c-441a-9c28-53ebf2e02b5c-log-httpd\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.027987 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-scripts\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.029378 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/840623b4-007c-441a-9c28-53ebf2e02b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.029742 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/840623b4-007c-441a-9c28-53ebf2e02b5c-log-httpd\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.035060 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.035083 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 
10:37:13.035540 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-scripts\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.035972 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.036011 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840623b4-007c-441a-9c28-53ebf2e02b5c-config-data\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.053832 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpn2m\" (UniqueName: \"kubernetes.io/projected/840623b4-007c-441a-9c28-53ebf2e02b5c-kube-api-access-hpn2m\") pod \"ceilometer-0\" (UID: \"840623b4-007c-441a-9c28-53ebf2e02b5c\") " pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.094511 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:37:13 crc kubenswrapper[4831]: I1204 10:37:13.292061 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7dc6587-6d8c-43f7-a7d2-23690ff571f9" path="/var/lib/kubelet/pods/c7dc6587-6d8c-43f7-a7d2-23690ff571f9/volumes" Dec 04 10:37:14 crc kubenswrapper[4831]: I1204 10:37:13.566282 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:37:14 crc kubenswrapper[4831]: I1204 10:37:13.612212 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"840623b4-007c-441a-9c28-53ebf2e02b5c","Type":"ContainerStarted","Data":"2d83258efac37bcade4a42342bf1d57dcbdcd799240d921eed2c41bf307f5223"} Dec 04 10:37:14 crc kubenswrapper[4831]: I1204 10:37:14.623432 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"840623b4-007c-441a-9c28-53ebf2e02b5c","Type":"ContainerStarted","Data":"0d00c7ee3470e2a6ff3ed80bae1df51fb5a08495f5de58ce5eba1789a46c06f8"} Dec 04 10:37:14 crc kubenswrapper[4831]: I1204 10:37:14.623796 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"840623b4-007c-441a-9c28-53ebf2e02b5c","Type":"ContainerStarted","Data":"c19615e4afe5c8ccfee5cffc1151b34deda5590a924df7c823fca7796dd94cd8"} Dec 04 10:37:14 crc kubenswrapper[4831]: I1204 10:37:14.913378 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:37:14 crc kubenswrapper[4831]: I1204 10:37:14.913425 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:37:15 crc kubenswrapper[4831]: I1204 10:37:15.633486 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"840623b4-007c-441a-9c28-53ebf2e02b5c","Type":"ContainerStarted","Data":"df465fd4d0b4341795511990b3196b3c6ffe08c8019eedc87079cf63adc09b14"} Dec 04 10:37:15 crc 
kubenswrapper[4831]: I1204 10:37:15.964002 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:37:15 crc kubenswrapper[4831]: I1204 10:37:15.963966 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:37:17 crc kubenswrapper[4831]: I1204 10:37:17.657340 4831 generic.go:334] "Generic (PLEG): container finished" podID="577e755d-3044-4c40-bdba-51fe1291b774" containerID="b61459417786d9bc254d7a785d825caf7f95f76f29934f94f5ef29526dfebf73" exitCode=0 Dec 04 10:37:17 crc kubenswrapper[4831]: I1204 10:37:17.657399 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6wj7l" event={"ID":"577e755d-3044-4c40-bdba-51fe1291b774","Type":"ContainerDied","Data":"b61459417786d9bc254d7a785d825caf7f95f76f29934f94f5ef29526dfebf73"} Dec 04 10:37:17 crc kubenswrapper[4831]: I1204 10:37:17.664013 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"840623b4-007c-441a-9c28-53ebf2e02b5c","Type":"ContainerStarted","Data":"796ae94dcaf02e63992fce6c0c37628dc2b5024641b678c31e86603f7276885f"} Dec 04 10:37:17 crc kubenswrapper[4831]: I1204 10:37:17.665456 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:37:17 crc kubenswrapper[4831]: I1204 10:37:17.764498 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.662480032 podStartE2EDuration="5.764474728s" podCreationTimestamp="2025-12-04 
10:37:12 +0000 UTC" firstStartedPulling="2025-12-04 10:37:13.596631482 +0000 UTC m=+1330.545806796" lastFinishedPulling="2025-12-04 10:37:16.698626178 +0000 UTC m=+1333.647801492" observedRunningTime="2025-12-04 10:37:17.75440051 +0000 UTC m=+1334.703575824" watchObservedRunningTime="2025-12-04 10:37:17.764474728 +0000 UTC m=+1334.713650032" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.071079 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.257390 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnk8r\" (UniqueName: \"kubernetes.io/projected/577e755d-3044-4c40-bdba-51fe1291b774-kube-api-access-gnk8r\") pod \"577e755d-3044-4c40-bdba-51fe1291b774\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.257501 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-config-data\") pod \"577e755d-3044-4c40-bdba-51fe1291b774\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.257634 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-combined-ca-bundle\") pod \"577e755d-3044-4c40-bdba-51fe1291b774\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.257680 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-scripts\") pod \"577e755d-3044-4c40-bdba-51fe1291b774\" (UID: \"577e755d-3044-4c40-bdba-51fe1291b774\") " Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 
10:37:19.262822 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-scripts" (OuterVolumeSpecName: "scripts") pod "577e755d-3044-4c40-bdba-51fe1291b774" (UID: "577e755d-3044-4c40-bdba-51fe1291b774"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.265273 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577e755d-3044-4c40-bdba-51fe1291b774-kube-api-access-gnk8r" (OuterVolumeSpecName: "kube-api-access-gnk8r") pod "577e755d-3044-4c40-bdba-51fe1291b774" (UID: "577e755d-3044-4c40-bdba-51fe1291b774"). InnerVolumeSpecName "kube-api-access-gnk8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.288371 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "577e755d-3044-4c40-bdba-51fe1291b774" (UID: "577e755d-3044-4c40-bdba-51fe1291b774"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.296957 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-config-data" (OuterVolumeSpecName: "config-data") pod "577e755d-3044-4c40-bdba-51fe1291b774" (UID: "577e755d-3044-4c40-bdba-51fe1291b774"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.359890 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnk8r\" (UniqueName: \"kubernetes.io/projected/577e755d-3044-4c40-bdba-51fe1291b774-kube-api-access-gnk8r\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.360213 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.360223 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.360231 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577e755d-3044-4c40-bdba-51fe1291b774-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.689296 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6wj7l" event={"ID":"577e755d-3044-4c40-bdba-51fe1291b774","Type":"ContainerDied","Data":"7c4a16ef7462224c6c190ca3fd21f43aefc1cd0f62f1bcbbcc4830a8fe33f6af"} Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.689328 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6wj7l" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.689348 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c4a16ef7462224c6c190ca3fd21f43aefc1cd0f62f1bcbbcc4830a8fe33f6af" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.844798 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.848621 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.850245 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.957214 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.957807 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerName="nova-api-log" containerID="cri-o://2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e" gracePeriod=30 Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.958441 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerName="nova-api-api" containerID="cri-o://5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80" gracePeriod=30 Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.993705 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:37:19 crc kubenswrapper[4831]: I1204 10:37:19.994349 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dbd6d571-6f32-4d3d-8149-02b1c74c9ec2" 
containerName="nova-scheduler-scheduler" containerID="cri-o://0362367f1999aa21a31283dbc6c3cdd4a16ff0d4c505294f6c9a56cd58fd5792" gracePeriod=30 Dec 04 10:37:20 crc kubenswrapper[4831]: I1204 10:37:20.013051 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:37:20 crc kubenswrapper[4831]: E1204 10:37:20.221405 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0362367f1999aa21a31283dbc6c3cdd4a16ff0d4c505294f6c9a56cd58fd5792" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:37:20 crc kubenswrapper[4831]: E1204 10:37:20.226139 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0362367f1999aa21a31283dbc6c3cdd4a16ff0d4c505294f6c9a56cd58fd5792" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:37:20 crc kubenswrapper[4831]: E1204 10:37:20.228888 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0362367f1999aa21a31283dbc6c3cdd4a16ff0d4c505294f6c9a56cd58fd5792" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:37:20 crc kubenswrapper[4831]: E1204 10:37:20.228950 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dbd6d571-6f32-4d3d-8149-02b1c74c9ec2" containerName="nova-scheduler-scheduler" Dec 04 10:37:20 crc kubenswrapper[4831]: I1204 10:37:20.701804 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerID="2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e" exitCode=143 Dec 04 10:37:20 crc kubenswrapper[4831]: I1204 10:37:20.702719 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3225b9e-74b2-4cc6-a556-75ececc9db23","Type":"ContainerDied","Data":"2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e"} Dec 04 10:37:20 crc kubenswrapper[4831]: I1204 10:37:20.715110 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.463927 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.621261 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-public-tls-certs\") pod \"f3225b9e-74b2-4cc6-a556-75ececc9db23\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.621387 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-config-data\") pod \"f3225b9e-74b2-4cc6-a556-75ececc9db23\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.621529 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3225b9e-74b2-4cc6-a556-75ececc9db23-logs\") pod \"f3225b9e-74b2-4cc6-a556-75ececc9db23\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.621562 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcp6w\" (UniqueName: 
\"kubernetes.io/projected/f3225b9e-74b2-4cc6-a556-75ececc9db23-kube-api-access-rcp6w\") pod \"f3225b9e-74b2-4cc6-a556-75ececc9db23\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.621596 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-combined-ca-bundle\") pod \"f3225b9e-74b2-4cc6-a556-75ececc9db23\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.621694 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-internal-tls-certs\") pod \"f3225b9e-74b2-4cc6-a556-75ececc9db23\" (UID: \"f3225b9e-74b2-4cc6-a556-75ececc9db23\") " Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.622567 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3225b9e-74b2-4cc6-a556-75ececc9db23-logs" (OuterVolumeSpecName: "logs") pod "f3225b9e-74b2-4cc6-a556-75ececc9db23" (UID: "f3225b9e-74b2-4cc6-a556-75ececc9db23"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.627693 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3225b9e-74b2-4cc6-a556-75ececc9db23-kube-api-access-rcp6w" (OuterVolumeSpecName: "kube-api-access-rcp6w") pod "f3225b9e-74b2-4cc6-a556-75ececc9db23" (UID: "f3225b9e-74b2-4cc6-a556-75ececc9db23"). InnerVolumeSpecName "kube-api-access-rcp6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.653978 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-config-data" (OuterVolumeSpecName: "config-data") pod "f3225b9e-74b2-4cc6-a556-75ececc9db23" (UID: "f3225b9e-74b2-4cc6-a556-75ececc9db23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.686796 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f3225b9e-74b2-4cc6-a556-75ececc9db23" (UID: "f3225b9e-74b2-4cc6-a556-75ececc9db23"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.688956 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3225b9e-74b2-4cc6-a556-75ececc9db23" (UID: "f3225b9e-74b2-4cc6-a556-75ececc9db23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.699317 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f3225b9e-74b2-4cc6-a556-75ececc9db23" (UID: "f3225b9e-74b2-4cc6-a556-75ececc9db23"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.718844 4831 generic.go:334] "Generic (PLEG): container finished" podID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerID="5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80" exitCode=0 Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.719239 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerName="nova-metadata-log" containerID="cri-o://22b6b312a91c400aedc2ded68076b60054520725a89ab67a126094001bb86f9e" gracePeriod=30 Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.719368 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3225b9e-74b2-4cc6-a556-75ececc9db23","Type":"ContainerDied","Data":"5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80"} Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.719421 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3225b9e-74b2-4cc6-a556-75ececc9db23","Type":"ContainerDied","Data":"ecd900dc76366e08ffc106b3c63181313c598399f62a606dc75dce5e2ecd77f6"} Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.719441 4831 scope.go:117] "RemoveContainer" containerID="5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.719476 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerName="nova-metadata-metadata" containerID="cri-o://acc19b0820a5d0abbc4f1e31955ec10a41ad6c86d5df871544bbc29a58be13ab" gracePeriod=30 Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.719168 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.728066 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.728104 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3225b9e-74b2-4cc6-a556-75ececc9db23-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.728117 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcp6w\" (UniqueName: \"kubernetes.io/projected/f3225b9e-74b2-4cc6-a556-75ececc9db23-kube-api-access-rcp6w\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.728132 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.728147 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.728158 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3225b9e-74b2-4cc6-a556-75ececc9db23-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.804196 4831 scope.go:117] "RemoveContainer" containerID="2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.808280 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:21 
crc kubenswrapper[4831]: I1204 10:37:21.833892 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.853717 4831 scope.go:117] "RemoveContainer" containerID="5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80" Dec 04 10:37:21 crc kubenswrapper[4831]: E1204 10:37:21.860235 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80\": container with ID starting with 5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80 not found: ID does not exist" containerID="5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.860292 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80"} err="failed to get container status \"5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80\": rpc error: code = NotFound desc = could not find container \"5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80\": container with ID starting with 5028bd9bad1b12b7d482dd9afd63090bfcd1eb222d8ad48aaf6ff3300c32ae80 not found: ID does not exist" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.860327 4831 scope.go:117] "RemoveContainer" containerID="2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.863447 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:21 crc kubenswrapper[4831]: E1204 10:37:21.863924 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577e755d-3044-4c40-bdba-51fe1291b774" containerName="nova-manage" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.863943 4831 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="577e755d-3044-4c40-bdba-51fe1291b774" containerName="nova-manage" Dec 04 10:37:21 crc kubenswrapper[4831]: E1204 10:37:21.863965 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerName="nova-api-log" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.863972 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerName="nova-api-log" Dec 04 10:37:21 crc kubenswrapper[4831]: E1204 10:37:21.863983 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerName="nova-api-api" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.863991 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerName="nova-api-api" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.864185 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerName="nova-api-log" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.864213 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3225b9e-74b2-4cc6-a556-75ececc9db23" containerName="nova-api-api" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.864236 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="577e755d-3044-4c40-bdba-51fe1291b774" containerName="nova-manage" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.865505 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.870867 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.871072 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.871179 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 10:37:21 crc kubenswrapper[4831]: E1204 10:37:21.877563 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e\": container with ID starting with 2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e not found: ID does not exist" containerID="2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.877608 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e"} err="failed to get container status \"2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e\": rpc error: code = NotFound desc = could not find container \"2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e\": container with ID starting with 2cff2fabf6cbb56748d000860c8e85dba7edbe10c105db83be8930c2838be03e not found: ID does not exist" Dec 04 10:37:21 crc kubenswrapper[4831]: I1204 10:37:21.884060 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.034702 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.034780 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4df7ced-bd99-4850-aced-9704ea48a817-logs\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.035012 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-config-data\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.035061 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6zbc\" (UniqueName: \"kubernetes.io/projected/a4df7ced-bd99-4850-aced-9704ea48a817-kube-api-access-p6zbc\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.035158 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-public-tls-certs\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.035320 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") 
" pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.136618 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-config-data\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.137002 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6zbc\" (UniqueName: \"kubernetes.io/projected/a4df7ced-bd99-4850-aced-9704ea48a817-kube-api-access-p6zbc\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.137039 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-public-tls-certs\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.137104 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.137247 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.137288 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a4df7ced-bd99-4850-aced-9704ea48a817-logs\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.137603 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4df7ced-bd99-4850-aced-9704ea48a817-logs\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.140989 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-config-data\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.141526 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.144051 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-public-tls-certs\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.145260 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4df7ced-bd99-4850-aced-9704ea48a817-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.197407 4831 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-p6zbc\" (UniqueName: \"kubernetes.io/projected/a4df7ced-bd99-4850-aced-9704ea48a817-kube-api-access-p6zbc\") pod \"nova-api-0\" (UID: \"a4df7ced-bd99-4850-aced-9704ea48a817\") " pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.493904 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.744360 4831 generic.go:334] "Generic (PLEG): container finished" podID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerID="acc19b0820a5d0abbc4f1e31955ec10a41ad6c86d5df871544bbc29a58be13ab" exitCode=0 Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.744630 4831 generic.go:334] "Generic (PLEG): container finished" podID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerID="22b6b312a91c400aedc2ded68076b60054520725a89ab67a126094001bb86f9e" exitCode=143 Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.744446 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"085dfcdf-cc1f-4769-b7bd-181169b959c4","Type":"ContainerDied","Data":"acc19b0820a5d0abbc4f1e31955ec10a41ad6c86d5df871544bbc29a58be13ab"} Dec 04 10:37:22 crc kubenswrapper[4831]: I1204 10:37:22.744685 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"085dfcdf-cc1f-4769-b7bd-181169b959c4","Type":"ContainerDied","Data":"22b6b312a91c400aedc2ded68076b60054520725a89ab67a126094001bb86f9e"} Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.022238 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:37:23 crc kubenswrapper[4831]: W1204 10:37:23.028159 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4df7ced_bd99_4850_aced_9704ea48a817.slice/crio-8c7645ebeb3f1cdb23a577cc66847ce0518825a188f9e677bf2e79e384450b2e WatchSource:0}: Error finding 
container 8c7645ebeb3f1cdb23a577cc66847ce0518825a188f9e677bf2e79e384450b2e: Status 404 returned error can't find the container with id 8c7645ebeb3f1cdb23a577cc66847ce0518825a188f9e677bf2e79e384450b2e Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.072485 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.259834 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2gnt\" (UniqueName: \"kubernetes.io/projected/085dfcdf-cc1f-4769-b7bd-181169b959c4-kube-api-access-h2gnt\") pod \"085dfcdf-cc1f-4769-b7bd-181169b959c4\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.260311 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-config-data\") pod \"085dfcdf-cc1f-4769-b7bd-181169b959c4\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.260347 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/085dfcdf-cc1f-4769-b7bd-181169b959c4-logs\") pod \"085dfcdf-cc1f-4769-b7bd-181169b959c4\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.260379 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-nova-metadata-tls-certs\") pod \"085dfcdf-cc1f-4769-b7bd-181169b959c4\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.260442 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-combined-ca-bundle\") pod \"085dfcdf-cc1f-4769-b7bd-181169b959c4\" (UID: \"085dfcdf-cc1f-4769-b7bd-181169b959c4\") " Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.260781 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/085dfcdf-cc1f-4769-b7bd-181169b959c4-logs" (OuterVolumeSpecName: "logs") pod "085dfcdf-cc1f-4769-b7bd-181169b959c4" (UID: "085dfcdf-cc1f-4769-b7bd-181169b959c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.261011 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/085dfcdf-cc1f-4769-b7bd-181169b959c4-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.264416 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085dfcdf-cc1f-4769-b7bd-181169b959c4-kube-api-access-h2gnt" (OuterVolumeSpecName: "kube-api-access-h2gnt") pod "085dfcdf-cc1f-4769-b7bd-181169b959c4" (UID: "085dfcdf-cc1f-4769-b7bd-181169b959c4"). InnerVolumeSpecName "kube-api-access-h2gnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.305603 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "085dfcdf-cc1f-4769-b7bd-181169b959c4" (UID: "085dfcdf-cc1f-4769-b7bd-181169b959c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.309609 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3225b9e-74b2-4cc6-a556-75ececc9db23" path="/var/lib/kubelet/pods/f3225b9e-74b2-4cc6-a556-75ececc9db23/volumes" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.315236 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-config-data" (OuterVolumeSpecName: "config-data") pod "085dfcdf-cc1f-4769-b7bd-181169b959c4" (UID: "085dfcdf-cc1f-4769-b7bd-181169b959c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.341729 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "085dfcdf-cc1f-4769-b7bd-181169b959c4" (UID: "085dfcdf-cc1f-4769-b7bd-181169b959c4"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.362805 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.362879 4831 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.362895 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085dfcdf-cc1f-4769-b7bd-181169b959c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.362907 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2gnt\" (UniqueName: \"kubernetes.io/projected/085dfcdf-cc1f-4769-b7bd-181169b959c4-kube-api-access-h2gnt\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.756364 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"085dfcdf-cc1f-4769-b7bd-181169b959c4","Type":"ContainerDied","Data":"b3988714f961583fec2d936ce8b0069914bc8c3290679797c5da4ddb65761721"} Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.756441 4831 scope.go:117] "RemoveContainer" containerID="acc19b0820a5d0abbc4f1e31955ec10a41ad6c86d5df871544bbc29a58be13ab" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.756979 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.759116 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4df7ced-bd99-4850-aced-9704ea48a817","Type":"ContainerStarted","Data":"fc68ea352810a84d701845eec609de06f62d29c7cf905421e5fa7ce5795d0028"} Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.759160 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4df7ced-bd99-4850-aced-9704ea48a817","Type":"ContainerStarted","Data":"37383fab53361704a7c6779e695f8f0aa31a1fb59cd139cd06fd851ac877b37b"} Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.759176 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4df7ced-bd99-4850-aced-9704ea48a817","Type":"ContainerStarted","Data":"8c7645ebeb3f1cdb23a577cc66847ce0518825a188f9e677bf2e79e384450b2e"} Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.786852 4831 scope.go:117] "RemoveContainer" containerID="22b6b312a91c400aedc2ded68076b60054520725a89ab67a126094001bb86f9e" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.797856 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.797841467 podStartE2EDuration="2.797841467s" podCreationTimestamp="2025-12-04 10:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:37:23.796171813 +0000 UTC m=+1340.745347127" watchObservedRunningTime="2025-12-04 10:37:23.797841467 +0000 UTC m=+1340.747016781" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.829692 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.842789 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:37:23 
crc kubenswrapper[4831]: I1204 10:37:23.854558 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:37:23 crc kubenswrapper[4831]: E1204 10:37:23.855179 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerName="nova-metadata-metadata" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.855202 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerName="nova-metadata-metadata" Dec 04 10:37:23 crc kubenswrapper[4831]: E1204 10:37:23.855231 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerName="nova-metadata-log" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.855237 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerName="nova-metadata-log" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.855449 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerName="nova-metadata-log" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.855470 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="085dfcdf-cc1f-4769-b7bd-181169b959c4" containerName="nova-metadata-metadata" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.856529 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.859075 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.859277 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.874896 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.979538 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11eb651c-b7cf-4ab7-af1c-4d9824621711-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.979822 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eb651c-b7cf-4ab7-af1c-4d9824621711-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.979970 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11eb651c-b7cf-4ab7-af1c-4d9824621711-logs\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.980053 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11eb651c-b7cf-4ab7-af1c-4d9824621711-config-data\") pod \"nova-metadata-0\" (UID: 
\"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:23 crc kubenswrapper[4831]: I1204 10:37:23.980170 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xkbw\" (UniqueName: \"kubernetes.io/projected/11eb651c-b7cf-4ab7-af1c-4d9824621711-kube-api-access-8xkbw\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.081706 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11eb651c-b7cf-4ab7-af1c-4d9824621711-config-data\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.081751 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xkbw\" (UniqueName: \"kubernetes.io/projected/11eb651c-b7cf-4ab7-af1c-4d9824621711-kube-api-access-8xkbw\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.081868 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11eb651c-b7cf-4ab7-af1c-4d9824621711-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.081923 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eb651c-b7cf-4ab7-af1c-4d9824621711-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 
10:37:24.081996 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11eb651c-b7cf-4ab7-af1c-4d9824621711-logs\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.082398 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11eb651c-b7cf-4ab7-af1c-4d9824621711-logs\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.088114 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eb651c-b7cf-4ab7-af1c-4d9824621711-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.088621 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11eb651c-b7cf-4ab7-af1c-4d9824621711-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.089458 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11eb651c-b7cf-4ab7-af1c-4d9824621711-config-data\") pod \"nova-metadata-0\" (UID: \"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.098975 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xkbw\" (UniqueName: \"kubernetes.io/projected/11eb651c-b7cf-4ab7-af1c-4d9824621711-kube-api-access-8xkbw\") pod \"nova-metadata-0\" (UID: 
\"11eb651c-b7cf-4ab7-af1c-4d9824621711\") " pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.187550 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.636624 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.772864 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11eb651c-b7cf-4ab7-af1c-4d9824621711","Type":"ContainerStarted","Data":"90c9d93b198b916de22e0630f1a0cc206cc7a26047f3243005bdffc8e530e6ad"} Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.776132 4831 generic.go:334] "Generic (PLEG): container finished" podID="dbd6d571-6f32-4d3d-8149-02b1c74c9ec2" containerID="0362367f1999aa21a31283dbc6c3cdd4a16ff0d4c505294f6c9a56cd58fd5792" exitCode=0 Dec 04 10:37:24 crc kubenswrapper[4831]: I1204 10:37:24.776189 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2","Type":"ContainerDied","Data":"0362367f1999aa21a31283dbc6c3cdd4a16ff0d4c505294f6c9a56cd58fd5792"} Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.159839 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.290902 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="085dfcdf-cc1f-4769-b7bd-181169b959c4" path="/var/lib/kubelet/pods/085dfcdf-cc1f-4769-b7bd-181169b959c4/volumes" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.307303 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l742x\" (UniqueName: \"kubernetes.io/projected/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-kube-api-access-l742x\") pod \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.307367 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-combined-ca-bundle\") pod \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.307485 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-config-data\") pod \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\" (UID: \"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2\") " Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.312117 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-kube-api-access-l742x" (OuterVolumeSpecName: "kube-api-access-l742x") pod "dbd6d571-6f32-4d3d-8149-02b1c74c9ec2" (UID: "dbd6d571-6f32-4d3d-8149-02b1c74c9ec2"). InnerVolumeSpecName "kube-api-access-l742x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.337917 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbd6d571-6f32-4d3d-8149-02b1c74c9ec2" (UID: "dbd6d571-6f32-4d3d-8149-02b1c74c9ec2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.360840 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-config-data" (OuterVolumeSpecName: "config-data") pod "dbd6d571-6f32-4d3d-8149-02b1c74c9ec2" (UID: "dbd6d571-6f32-4d3d-8149-02b1c74c9ec2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.409635 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l742x\" (UniqueName: \"kubernetes.io/projected/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-kube-api-access-l742x\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.409697 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.409711 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.788012 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"dbd6d571-6f32-4d3d-8149-02b1c74c9ec2","Type":"ContainerDied","Data":"3ae98c822c8829ee8972feff04bb4b66cb3f7a8baa1f146ae8df4d0cabf65bd3"} Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.788029 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.788313 4831 scope.go:117] "RemoveContainer" containerID="0362367f1999aa21a31283dbc6c3cdd4a16ff0d4c505294f6c9a56cd58fd5792" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.790054 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11eb651c-b7cf-4ab7-af1c-4d9824621711","Type":"ContainerStarted","Data":"99d87a696f3ec01acaec2e43c28df99a827c0e02767352631d2bd669a46cc726"} Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.790097 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11eb651c-b7cf-4ab7-af1c-4d9824621711","Type":"ContainerStarted","Data":"c2eb44686cd95cc23c667ead639de5dadd978e5ee18a93521c7b00d881fe61ae"} Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.871206 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.871182829 podStartE2EDuration="2.871182829s" podCreationTimestamp="2025-12-04 10:37:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:37:25.823560124 +0000 UTC m=+1342.772735438" watchObservedRunningTime="2025-12-04 10:37:25.871182829 +0000 UTC m=+1342.820358143" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.875467 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.908911 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:37:25 crc 
kubenswrapper[4831]: I1204 10:37:25.918574 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:37:25 crc kubenswrapper[4831]: E1204 10:37:25.919212 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd6d571-6f32-4d3d-8149-02b1c74c9ec2" containerName="nova-scheduler-scheduler" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.919239 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd6d571-6f32-4d3d-8149-02b1c74c9ec2" containerName="nova-scheduler-scheduler" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.919508 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd6d571-6f32-4d3d-8149-02b1c74c9ec2" containerName="nova-scheduler-scheduler" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.920485 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.924788 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 10:37:25 crc kubenswrapper[4831]: I1204 10:37:25.928142 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.021591 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c3c00f-9970-4177-913d-64eb1e895bec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d1c3c00f-9970-4177-913d-64eb1e895bec\") " pod="openstack/nova-scheduler-0" Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.021733 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c3c00f-9970-4177-913d-64eb1e895bec-config-data\") pod \"nova-scheduler-0\" (UID: \"d1c3c00f-9970-4177-913d-64eb1e895bec\") " 
pod="openstack/nova-scheduler-0" Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.021825 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltvxp\" (UniqueName: \"kubernetes.io/projected/d1c3c00f-9970-4177-913d-64eb1e895bec-kube-api-access-ltvxp\") pod \"nova-scheduler-0\" (UID: \"d1c3c00f-9970-4177-913d-64eb1e895bec\") " pod="openstack/nova-scheduler-0" Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.123480 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c3c00f-9970-4177-913d-64eb1e895bec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d1c3c00f-9970-4177-913d-64eb1e895bec\") " pod="openstack/nova-scheduler-0" Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.123876 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c3c00f-9970-4177-913d-64eb1e895bec-config-data\") pod \"nova-scheduler-0\" (UID: \"d1c3c00f-9970-4177-913d-64eb1e895bec\") " pod="openstack/nova-scheduler-0" Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.123999 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltvxp\" (UniqueName: \"kubernetes.io/projected/d1c3c00f-9970-4177-913d-64eb1e895bec-kube-api-access-ltvxp\") pod \"nova-scheduler-0\" (UID: \"d1c3c00f-9970-4177-913d-64eb1e895bec\") " pod="openstack/nova-scheduler-0" Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.129293 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c3c00f-9970-4177-913d-64eb1e895bec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d1c3c00f-9970-4177-913d-64eb1e895bec\") " pod="openstack/nova-scheduler-0" Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.129784 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c3c00f-9970-4177-913d-64eb1e895bec-config-data\") pod \"nova-scheduler-0\" (UID: \"d1c3c00f-9970-4177-913d-64eb1e895bec\") " pod="openstack/nova-scheduler-0" Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.140089 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltvxp\" (UniqueName: \"kubernetes.io/projected/d1c3c00f-9970-4177-913d-64eb1e895bec-kube-api-access-ltvxp\") pod \"nova-scheduler-0\" (UID: \"d1c3c00f-9970-4177-913d-64eb1e895bec\") " pod="openstack/nova-scheduler-0" Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.247741 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:37:26 crc kubenswrapper[4831]: W1204 10:37:26.717708 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1c3c00f_9970_4177_913d_64eb1e895bec.slice/crio-833be4f4188d70f047eef7d3ed39892cf50581227a515dec3da28077107b58e3 WatchSource:0}: Error finding container 833be4f4188d70f047eef7d3ed39892cf50581227a515dec3da28077107b58e3: Status 404 returned error can't find the container with id 833be4f4188d70f047eef7d3ed39892cf50581227a515dec3da28077107b58e3 Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.718058 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:37:26 crc kubenswrapper[4831]: I1204 10:37:26.803123 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d1c3c00f-9970-4177-913d-64eb1e895bec","Type":"ContainerStarted","Data":"833be4f4188d70f047eef7d3ed39892cf50581227a515dec3da28077107b58e3"} Dec 04 10:37:27 crc kubenswrapper[4831]: I1204 10:37:27.287115 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd6d571-6f32-4d3d-8149-02b1c74c9ec2" 
path="/var/lib/kubelet/pods/dbd6d571-6f32-4d3d-8149-02b1c74c9ec2/volumes" Dec 04 10:37:27 crc kubenswrapper[4831]: I1204 10:37:27.815003 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d1c3c00f-9970-4177-913d-64eb1e895bec","Type":"ContainerStarted","Data":"1ea150eba589a743a5c4f0082e731755d72dc63c051a3d2d4dfc12d468281917"} Dec 04 10:37:27 crc kubenswrapper[4831]: I1204 10:37:27.834232 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.834214181 podStartE2EDuration="2.834214181s" podCreationTimestamp="2025-12-04 10:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:37:27.828037907 +0000 UTC m=+1344.777213231" watchObservedRunningTime="2025-12-04 10:37:27.834214181 +0000 UTC m=+1344.783389495" Dec 04 10:37:29 crc kubenswrapper[4831]: I1204 10:37:29.188840 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:37:29 crc kubenswrapper[4831]: I1204 10:37:29.190154 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:37:31 crc kubenswrapper[4831]: I1204 10:37:31.248763 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 10:37:32 crc kubenswrapper[4831]: I1204 10:37:32.495101 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:37:32 crc kubenswrapper[4831]: I1204 10:37:32.495151 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:37:33 crc kubenswrapper[4831]: I1204 10:37:33.507892 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a4df7ced-bd99-4850-aced-9704ea48a817" containerName="nova-api-log" 
probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:37:33 crc kubenswrapper[4831]: I1204 10:37:33.508498 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a4df7ced-bd99-4850-aced-9704ea48a817" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:37:34 crc kubenswrapper[4831]: I1204 10:37:34.188846 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:37:34 crc kubenswrapper[4831]: I1204 10:37:34.188909 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:37:35 crc kubenswrapper[4831]: I1204 10:37:35.202942 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="11eb651c-b7cf-4ab7-af1c-4d9824621711" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.223:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:37:35 crc kubenswrapper[4831]: I1204 10:37:35.203016 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="11eb651c-b7cf-4ab7-af1c-4d9824621711" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.223:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:37:36 crc kubenswrapper[4831]: I1204 10:37:36.247964 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 10:37:36 crc kubenswrapper[4831]: I1204 10:37:36.281111 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 10:37:36 crc kubenswrapper[4831]: I1204 
10:37:36.944485 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 10:37:42 crc kubenswrapper[4831]: I1204 10:37:42.505586 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:37:42 crc kubenswrapper[4831]: I1204 10:37:42.506482 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:37:42 crc kubenswrapper[4831]: I1204 10:37:42.507455 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:37:42 crc kubenswrapper[4831]: I1204 10:37:42.515215 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:37:42 crc kubenswrapper[4831]: I1204 10:37:42.959572 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:37:42 crc kubenswrapper[4831]: I1204 10:37:42.971141 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:37:43 crc kubenswrapper[4831]: I1204 10:37:43.237425 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 10:37:44 crc kubenswrapper[4831]: I1204 10:37:44.193803 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:37:44 crc kubenswrapper[4831]: I1204 10:37:44.195504 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:37:44 crc kubenswrapper[4831]: I1204 10:37:44.199608 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 10:37:44 crc kubenswrapper[4831]: I1204 10:37:44.978915 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 10:37:53 crc kubenswrapper[4831]: I1204 
10:37:53.850922 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:37:54 crc kubenswrapper[4831]: I1204 10:37:54.837252 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:37:57 crc kubenswrapper[4831]: I1204 10:37:57.259299 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="462ff702-35a4-4cbe-8155-3ce8a321bf48" containerName="rabbitmq" containerID="cri-o://06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1" gracePeriod=604797 Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.108600 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2d1e04df-4c2a-440f-b533-9903a58c8ecc" containerName="rabbitmq" containerID="cri-o://13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77" gracePeriod=604797 Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.892918 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.963601 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5k49\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-kube-api-access-d5k49\") pod \"462ff702-35a4-4cbe-8155-3ce8a321bf48\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.963742 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"462ff702-35a4-4cbe-8155-3ce8a321bf48\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.963888 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-confd\") pod \"462ff702-35a4-4cbe-8155-3ce8a321bf48\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.963941 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-config-data\") pod \"462ff702-35a4-4cbe-8155-3ce8a321bf48\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.963978 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-plugins-conf\") pod \"462ff702-35a4-4cbe-8155-3ce8a321bf48\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.964040 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-server-conf\") pod \"462ff702-35a4-4cbe-8155-3ce8a321bf48\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.964109 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/462ff702-35a4-4cbe-8155-3ce8a321bf48-pod-info\") pod \"462ff702-35a4-4cbe-8155-3ce8a321bf48\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.964199 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-tls\") pod \"462ff702-35a4-4cbe-8155-3ce8a321bf48\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.964286 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-erlang-cookie\") pod \"462ff702-35a4-4cbe-8155-3ce8a321bf48\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.964362 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/462ff702-35a4-4cbe-8155-3ce8a321bf48-erlang-cookie-secret\") pod \"462ff702-35a4-4cbe-8155-3ce8a321bf48\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.964482 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-plugins\") pod \"462ff702-35a4-4cbe-8155-3ce8a321bf48\" (UID: \"462ff702-35a4-4cbe-8155-3ce8a321bf48\") " Dec 04 10:37:58 crc kubenswrapper[4831]: 
I1204 10:37:58.965061 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "462ff702-35a4-4cbe-8155-3ce8a321bf48" (UID: "462ff702-35a4-4cbe-8155-3ce8a321bf48"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.965202 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "462ff702-35a4-4cbe-8155-3ce8a321bf48" (UID: "462ff702-35a4-4cbe-8155-3ce8a321bf48"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.965440 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "462ff702-35a4-4cbe-8155-3ce8a321bf48" (UID: "462ff702-35a4-4cbe-8155-3ce8a321bf48"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.971184 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/462ff702-35a4-4cbe-8155-3ce8a321bf48-pod-info" (OuterVolumeSpecName: "pod-info") pod "462ff702-35a4-4cbe-8155-3ce8a321bf48" (UID: "462ff702-35a4-4cbe-8155-3ce8a321bf48"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.971773 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "462ff702-35a4-4cbe-8155-3ce8a321bf48" (UID: "462ff702-35a4-4cbe-8155-3ce8a321bf48"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.972062 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "462ff702-35a4-4cbe-8155-3ce8a321bf48" (UID: "462ff702-35a4-4cbe-8155-3ce8a321bf48"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.989455 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-kube-api-access-d5k49" (OuterVolumeSpecName: "kube-api-access-d5k49") pod "462ff702-35a4-4cbe-8155-3ce8a321bf48" (UID: "462ff702-35a4-4cbe-8155-3ce8a321bf48"). InnerVolumeSpecName "kube-api-access-d5k49". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:58 crc kubenswrapper[4831]: I1204 10:37:58.994058 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462ff702-35a4-4cbe-8155-3ce8a321bf48-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "462ff702-35a4-4cbe-8155-3ce8a321bf48" (UID: "462ff702-35a4-4cbe-8155-3ce8a321bf48"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.003207 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-config-data" (OuterVolumeSpecName: "config-data") pod "462ff702-35a4-4cbe-8155-3ce8a321bf48" (UID: "462ff702-35a4-4cbe-8155-3ce8a321bf48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.046839 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-server-conf" (OuterVolumeSpecName: "server-conf") pod "462ff702-35a4-4cbe-8155-3ce8a321bf48" (UID: "462ff702-35a4-4cbe-8155-3ce8a321bf48"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.068995 4831 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/462ff702-35a4-4cbe-8155-3ce8a321bf48-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.069038 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.069055 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.069067 4831 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/462ff702-35a4-4cbe-8155-3ce8a321bf48-erlang-cookie-secret\") on node \"crc\" DevicePath 
\"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.069079 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.069090 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5k49\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-kube-api-access-d5k49\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.069129 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.069140 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.069151 4831 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.069161 4831 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/462ff702-35a4-4cbe-8155-3ce8a321bf48-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.108876 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "462ff702-35a4-4cbe-8155-3ce8a321bf48" (UID: "462ff702-35a4-4cbe-8155-3ce8a321bf48"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.114415 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.116105 4831 generic.go:334] "Generic (PLEG): container finished" podID="462ff702-35a4-4cbe-8155-3ce8a321bf48" containerID="06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1" exitCode=0 Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.116160 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"462ff702-35a4-4cbe-8155-3ce8a321bf48","Type":"ContainerDied","Data":"06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1"} Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.116191 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"462ff702-35a4-4cbe-8155-3ce8a321bf48","Type":"ContainerDied","Data":"13964deb8d083e42080833315f3f4f2ab2c52f2675d2f0a8d620b17210c9e80b"} Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.116206 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.116211 4831 scope.go:117] "RemoveContainer" containerID="06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.155185 4831 scope.go:117] "RemoveContainer" containerID="92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.174248 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.174289 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/462ff702-35a4-4cbe-8155-3ce8a321bf48-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.186695 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.199267 4831 scope.go:117] "RemoveContainer" containerID="06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1" Dec 04 10:37:59 crc kubenswrapper[4831]: E1204 10:37:59.200138 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1\": container with ID starting with 06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1 not found: ID does not exist" containerID="06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.200177 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1"} err="failed to get container status 
\"06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1\": rpc error: code = NotFound desc = could not find container \"06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1\": container with ID starting with 06ead54d3b9961b374f03f78f071a94d7c8a3739adc54b4521b471bd6df743d1 not found: ID does not exist" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.200200 4831 scope.go:117] "RemoveContainer" containerID="92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96" Dec 04 10:37:59 crc kubenswrapper[4831]: E1204 10:37:59.200799 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96\": container with ID starting with 92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96 not found: ID does not exist" containerID="92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.200827 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96"} err="failed to get container status \"92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96\": rpc error: code = NotFound desc = could not find container \"92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96\": container with ID starting with 92a2d996ef4484246c9e0c6a7199e478aa295d1ca7aa713e7d588129a1cc9d96 not found: ID does not exist" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.212752 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.223433 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:37:59 crc kubenswrapper[4831]: E1204 10:37:59.224127 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="462ff702-35a4-4cbe-8155-3ce8a321bf48" containerName="setup-container" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.224209 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="462ff702-35a4-4cbe-8155-3ce8a321bf48" containerName="setup-container" Dec 04 10:37:59 crc kubenswrapper[4831]: E1204 10:37:59.224311 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462ff702-35a4-4cbe-8155-3ce8a321bf48" containerName="rabbitmq" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.224369 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="462ff702-35a4-4cbe-8155-3ce8a321bf48" containerName="rabbitmq" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.224689 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="462ff702-35a4-4cbe-8155-3ce8a321bf48" containerName="rabbitmq" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.225971 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.227695 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.233007 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.233177 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.233255 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.234039 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.234194 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-rabbitmq-svc" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.234386 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gh87l" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.236294 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.275825 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.275877 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75f6fd12-4651-4b5f-9eec-d192367b85ad-config-data\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.275956 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75f6fd12-4651-4b5f-9eec-d192367b85ad-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.276002 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.276027 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.276080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w89zn\" (UniqueName: \"kubernetes.io/projected/75f6fd12-4651-4b5f-9eec-d192367b85ad-kube-api-access-w89zn\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.276103 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.276137 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75f6fd12-4651-4b5f-9eec-d192367b85ad-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.276172 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75f6fd12-4651-4b5f-9eec-d192367b85ad-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.276187 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75f6fd12-4651-4b5f-9eec-d192367b85ad-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.276209 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.293001 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462ff702-35a4-4cbe-8155-3ce8a321bf48" path="/var/lib/kubelet/pods/462ff702-35a4-4cbe-8155-3ce8a321bf48/volumes" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.378628 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75f6fd12-4651-4b5f-9eec-d192367b85ad-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.378721 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75f6fd12-4651-4b5f-9eec-d192367b85ad-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.378750 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75f6fd12-4651-4b5f-9eec-d192367b85ad-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc 
kubenswrapper[4831]: I1204 10:37:59.378780 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.378801 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.378820 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75f6fd12-4651-4b5f-9eec-d192367b85ad-config-data\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.378888 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75f6fd12-4651-4b5f-9eec-d192367b85ad-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.378927 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.378956 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.379023 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w89zn\" (UniqueName: \"kubernetes.io/projected/75f6fd12-4651-4b5f-9eec-d192367b85ad-kube-api-access-w89zn\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.379061 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.379773 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.380133 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.380155 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.380765 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75f6fd12-4651-4b5f-9eec-d192367b85ad-config-data\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.382024 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75f6fd12-4651-4b5f-9eec-d192367b85ad-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.384589 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75f6fd12-4651-4b5f-9eec-d192367b85ad-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.385438 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.386375 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75f6fd12-4651-4b5f-9eec-d192367b85ad-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.392233 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75f6fd12-4651-4b5f-9eec-d192367b85ad-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.395766 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75f6fd12-4651-4b5f-9eec-d192367b85ad-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.405653 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w89zn\" (UniqueName: \"kubernetes.io/projected/75f6fd12-4651-4b5f-9eec-d192367b85ad-kube-api-access-w89zn\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.448013 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"75f6fd12-4651-4b5f-9eec-d192367b85ad\") " pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.555402 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.805607 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.896150 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-server-conf\") pod \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.896395 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-erlang-cookie\") pod \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.896424 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-plugins-conf\") pod \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.896574 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw7nj\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-kube-api-access-vw7nj\") pod \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.896627 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-confd\") pod \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.896669 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-config-data\") pod \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.896693 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-plugins\") pod \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.896748 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d1e04df-4c2a-440f-b533-9903a58c8ecc-erlang-cookie-secret\") pod \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.896778 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-tls\") pod \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.896824 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d1e04df-4c2a-440f-b533-9903a58c8ecc-pod-info\") pod \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.896868 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\" (UID: \"2d1e04df-4c2a-440f-b533-9903a58c8ecc\") " Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 
10:37:59.897372 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2d1e04df-4c2a-440f-b533-9903a58c8ecc" (UID: "2d1e04df-4c2a-440f-b533-9903a58c8ecc"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.897607 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.897732 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2d1e04df-4c2a-440f-b533-9903a58c8ecc" (UID: "2d1e04df-4c2a-440f-b533-9903a58c8ecc"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.897869 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2d1e04df-4c2a-440f-b533-9903a58c8ecc" (UID: "2d1e04df-4c2a-440f-b533-9903a58c8ecc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.901733 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "2d1e04df-4c2a-440f-b533-9903a58c8ecc" (UID: "2d1e04df-4c2a-440f-b533-9903a58c8ecc"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.902370 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-kube-api-access-vw7nj" (OuterVolumeSpecName: "kube-api-access-vw7nj") pod "2d1e04df-4c2a-440f-b533-9903a58c8ecc" (UID: "2d1e04df-4c2a-440f-b533-9903a58c8ecc"). InnerVolumeSpecName "kube-api-access-vw7nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.902506 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1e04df-4c2a-440f-b533-9903a58c8ecc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2d1e04df-4c2a-440f-b533-9903a58c8ecc" (UID: "2d1e04df-4c2a-440f-b533-9903a58c8ecc"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.903635 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2d1e04df-4c2a-440f-b533-9903a58c8ecc-pod-info" (OuterVolumeSpecName: "pod-info") pod "2d1e04df-4c2a-440f-b533-9903a58c8ecc" (UID: "2d1e04df-4c2a-440f-b533-9903a58c8ecc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.910969 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2d1e04df-4c2a-440f-b533-9903a58c8ecc" (UID: "2d1e04df-4c2a-440f-b533-9903a58c8ecc"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.955371 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-config-data" (OuterVolumeSpecName: "config-data") pod "2d1e04df-4c2a-440f-b533-9903a58c8ecc" (UID: "2d1e04df-4c2a-440f-b533-9903a58c8ecc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:37:59 crc kubenswrapper[4831]: I1204 10:37:59.973590 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-server-conf" (OuterVolumeSpecName: "server-conf") pod "2d1e04df-4c2a-440f-b533-9903a58c8ecc" (UID: "2d1e04df-4c2a-440f-b533-9903a58c8ecc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.000009 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.000046 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw7nj\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-kube-api-access-vw7nj\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.000060 4831 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.000074 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-config-data\") on node \"crc\" 
DevicePath \"\"" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.000085 4831 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d1e04df-4c2a-440f-b533-9903a58c8ecc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.000095 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.000105 4831 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d1e04df-4c2a-440f-b533-9903a58c8ecc-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.000137 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.000149 4831 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d1e04df-4c2a-440f-b533-9903a58c8ecc-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.023903 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2d1e04df-4c2a-440f-b533-9903a58c8ecc" (UID: "2d1e04df-4c2a-440f-b533-9903a58c8ecc"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.029795 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.050039 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.102049 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.102095 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d1e04df-4c2a-440f-b533-9903a58c8ecc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.126475 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75f6fd12-4651-4b5f-9eec-d192367b85ad","Type":"ContainerStarted","Data":"f4599101aa9cb0c1af958af4f6cb217b581c576787843b760c02bb00e489517a"} Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.128119 4831 generic.go:334] "Generic (PLEG): container finished" podID="2d1e04df-4c2a-440f-b533-9903a58c8ecc" containerID="13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77" exitCode=0 Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.128151 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d1e04df-4c2a-440f-b533-9903a58c8ecc","Type":"ContainerDied","Data":"13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77"} Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.128171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"2d1e04df-4c2a-440f-b533-9903a58c8ecc","Type":"ContainerDied","Data":"82a1c770b6e2cc059228f39ee796551f8e0645e72f5d1690a5c64bdb79664d6e"} Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.128172 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.128188 4831 scope.go:117] "RemoveContainer" containerID="13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.157479 4831 scope.go:117] "RemoveContainer" containerID="c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.185594 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.196693 4831 scope.go:117] "RemoveContainer" containerID="13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77" Dec 04 10:38:00 crc kubenswrapper[4831]: E1204 10:38:00.197103 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77\": container with ID starting with 13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77 not found: ID does not exist" containerID="13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.197142 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77"} err="failed to get container status \"13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77\": rpc error: code = NotFound desc = could not find container \"13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77\": container with ID starting with 
13a0cc1dacfc84e2f2f951677b6763271ea13bee4c69f204082d8cd8e7422c77 not found: ID does not exist" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.197168 4831 scope.go:117] "RemoveContainer" containerID="c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.197517 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:38:00 crc kubenswrapper[4831]: E1204 10:38:00.197587 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a\": container with ID starting with c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a not found: ID does not exist" containerID="c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.197607 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a"} err="failed to get container status \"c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a\": rpc error: code = NotFound desc = could not find container \"c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a\": container with ID starting with c1f343b6b370799b8079dd35ff1cdd9be187941142d4acadb941c8c7445a556a not found: ID does not exist" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.213953 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:38:00 crc kubenswrapper[4831]: E1204 10:38:00.227051 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1e04df-4c2a-440f-b533-9903a58c8ecc" containerName="setup-container" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.227089 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2d1e04df-4c2a-440f-b533-9903a58c8ecc" containerName="setup-container" Dec 04 10:38:00 crc kubenswrapper[4831]: E1204 10:38:00.227137 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1e04df-4c2a-440f-b533-9903a58c8ecc" containerName="rabbitmq" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.227144 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1e04df-4c2a-440f-b533-9903a58c8ecc" containerName="rabbitmq" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.227463 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1e04df-4c2a-440f-b533-9903a58c8ecc" containerName="rabbitmq" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.230912 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.231056 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.233401 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.233679 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.233737 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.233865 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ll27g" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.234010 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.234391 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"rabbitmq-cell1-default-user" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.234517 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.305870 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7544cda9-54d6-47c9-8ba1-0834b882e674-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.305939 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.306032 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnbkv\" (UniqueName: \"kubernetes.io/projected/7544cda9-54d6-47c9-8ba1-0834b882e674-kube-api-access-qnbkv\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.306054 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.306084 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.306100 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7544cda9-54d6-47c9-8ba1-0834b882e674-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.306136 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.306174 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7544cda9-54d6-47c9-8ba1-0834b882e674-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.306258 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7544cda9-54d6-47c9-8ba1-0834b882e674-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.306278 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.306294 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7544cda9-54d6-47c9-8ba1-0834b882e674-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.407818 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7544cda9-54d6-47c9-8ba1-0834b882e674-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.407881 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.407940 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnbkv\" (UniqueName: \"kubernetes.io/projected/7544cda9-54d6-47c9-8ba1-0834b882e674-kube-api-access-qnbkv\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.407968 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.408030 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.408048 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7544cda9-54d6-47c9-8ba1-0834b882e674-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.408073 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.408107 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7544cda9-54d6-47c9-8ba1-0834b882e674-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.408160 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7544cda9-54d6-47c9-8ba1-0834b882e674-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 
04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.408182 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7544cda9-54d6-47c9-8ba1-0834b882e674-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.408203 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.408700 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.409150 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.409151 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.409638 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7544cda9-54d6-47c9-8ba1-0834b882e674-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.409711 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7544cda9-54d6-47c9-8ba1-0834b882e674-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.410607 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7544cda9-54d6-47c9-8ba1-0834b882e674-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.413764 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.414161 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7544cda9-54d6-47c9-8ba1-0834b882e674-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.414187 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/7544cda9-54d6-47c9-8ba1-0834b882e674-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.417309 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7544cda9-54d6-47c9-8ba1-0834b882e674-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.427284 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnbkv\" (UniqueName: \"kubernetes.io/projected/7544cda9-54d6-47c9-8ba1-0834b882e674-kube-api-access-qnbkv\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.443478 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7544cda9-54d6-47c9-8ba1-0834b882e674\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:00 crc kubenswrapper[4831]: I1204 10:38:00.559910 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:01 crc kubenswrapper[4831]: I1204 10:38:01.043794 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:38:01 crc kubenswrapper[4831]: W1204 10:38:01.088131 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7544cda9_54d6_47c9_8ba1_0834b882e674.slice/crio-07eb03d24d2ba63063cf4f54247769c49715c7a1fab34e12e92a43775b07208f WatchSource:0}: Error finding container 07eb03d24d2ba63063cf4f54247769c49715c7a1fab34e12e92a43775b07208f: Status 404 returned error can't find the container with id 07eb03d24d2ba63063cf4f54247769c49715c7a1fab34e12e92a43775b07208f Dec 04 10:38:01 crc kubenswrapper[4831]: I1204 10:38:01.159280 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7544cda9-54d6-47c9-8ba1-0834b882e674","Type":"ContainerStarted","Data":"07eb03d24d2ba63063cf4f54247769c49715c7a1fab34e12e92a43775b07208f"} Dec 04 10:38:01 crc kubenswrapper[4831]: I1204 10:38:01.291578 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d1e04df-4c2a-440f-b533-9903a58c8ecc" path="/var/lib/kubelet/pods/2d1e04df-4c2a-440f-b533-9903a58c8ecc/volumes" Dec 04 10:38:02 crc kubenswrapper[4831]: I1204 10:38:02.172188 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75f6fd12-4651-4b5f-9eec-d192367b85ad","Type":"ContainerStarted","Data":"3ff8e17d0e7fa0f6cb3e4b39b90300c3de29c1e95b07913ddaae6d333bf4bd9f"} Dec 04 10:38:03 crc kubenswrapper[4831]: I1204 10:38:03.185311 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7544cda9-54d6-47c9-8ba1-0834b882e674","Type":"ContainerStarted","Data":"d93f2d24d73a2018885211a49ed6910ce3b853978a2b5c499b7e78e7a4c5eb2a"} Dec 04 10:38:09 crc kubenswrapper[4831]: I1204 10:38:09.785460 
4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c94479b55-qlwx5"] Dec 04 10:38:09 crc kubenswrapper[4831]: I1204 10:38:09.788117 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:09 crc kubenswrapper[4831]: I1204 10:38:09.791611 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 04 10:38:09 crc kubenswrapper[4831]: I1204 10:38:09.815006 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c94479b55-qlwx5"] Dec 04 10:38:09 crc kubenswrapper[4831]: I1204 10:38:09.899935 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:09 crc kubenswrapper[4831]: I1204 10:38:09.900003 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:09 crc kubenswrapper[4831]: I1204 10:38:09.900071 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-config\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:09 crc kubenswrapper[4831]: I1204 10:38:09.900304 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:09 crc kubenswrapper[4831]: I1204 10:38:09.900357 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-svc\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:09 crc kubenswrapper[4831]: I1204 10:38:09.900566 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-openstack-edpm-ipam\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:09 crc kubenswrapper[4831]: I1204 10:38:09.900650 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngbjw\" (UniqueName: \"kubernetes.io/projected/8d642828-f7be-4185-82f1-cfd45f494f9b-kube-api-access-ngbjw\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.002309 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-svc\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.002414 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-openstack-edpm-ipam\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.002438 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngbjw\" (UniqueName: \"kubernetes.io/projected/8d642828-f7be-4185-82f1-cfd45f494f9b-kube-api-access-ngbjw\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.003455 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-svc\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.003485 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-openstack-edpm-ipam\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.003624 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.004218 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.004277 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.004820 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.004884 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-config\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.005428 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-config\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.005609 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-nb\") 
pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.006175 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.034134 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngbjw\" (UniqueName: \"kubernetes.io/projected/8d642828-f7be-4185-82f1-cfd45f494f9b-kube-api-access-ngbjw\") pod \"dnsmasq-dns-5c94479b55-qlwx5\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.109470 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:10 crc kubenswrapper[4831]: I1204 10:38:10.579904 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c94479b55-qlwx5"] Dec 04 10:38:10 crc kubenswrapper[4831]: W1204 10:38:10.603275 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d642828_f7be_4185_82f1_cfd45f494f9b.slice/crio-44b592b8e5f22bb80f9cf48cdeeb80c84217f7256b36908ac8e97b0aeee6524f WatchSource:0}: Error finding container 44b592b8e5f22bb80f9cf48cdeeb80c84217f7256b36908ac8e97b0aeee6524f: Status 404 returned error can't find the container with id 44b592b8e5f22bb80f9cf48cdeeb80c84217f7256b36908ac8e97b0aeee6524f Dec 04 10:38:11 crc kubenswrapper[4831]: I1204 10:38:11.274100 4831 generic.go:334] "Generic (PLEG): container finished" podID="8d642828-f7be-4185-82f1-cfd45f494f9b" containerID="36f0b13d4303560308d816c71bcefaaf71a38bb0b887290881ad7dfe5be8eff7" exitCode=0 Dec 04 10:38:11 crc kubenswrapper[4831]: I1204 10:38:11.274359 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" event={"ID":"8d642828-f7be-4185-82f1-cfd45f494f9b","Type":"ContainerDied","Data":"36f0b13d4303560308d816c71bcefaaf71a38bb0b887290881ad7dfe5be8eff7"} Dec 04 10:38:11 crc kubenswrapper[4831]: I1204 10:38:11.274546 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" event={"ID":"8d642828-f7be-4185-82f1-cfd45f494f9b","Type":"ContainerStarted","Data":"44b592b8e5f22bb80f9cf48cdeeb80c84217f7256b36908ac8e97b0aeee6524f"} Dec 04 10:38:12 crc kubenswrapper[4831]: I1204 10:38:12.284539 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" event={"ID":"8d642828-f7be-4185-82f1-cfd45f494f9b","Type":"ContainerStarted","Data":"ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2"} Dec 04 10:38:12 crc 
kubenswrapper[4831]: I1204 10:38:12.286072 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:12 crc kubenswrapper[4831]: I1204 10:38:12.306981 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" podStartSLOduration=3.306963986 podStartE2EDuration="3.306963986s" podCreationTimestamp="2025-12-04 10:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:38:12.303950967 +0000 UTC m=+1389.253126281" watchObservedRunningTime="2025-12-04 10:38:12.306963986 +0000 UTC m=+1389.256139300" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.111898 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.209671 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dddb665-dtvkq"] Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.211274 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" podUID="88fcf3fe-10ba-460e-89bb-6b94936a183e" containerName="dnsmasq-dns" containerID="cri-o://b271798fadb7ecde85c8ec4c83e30da4c06274d7287f6522556443c07aa234e6" gracePeriod=10 Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.385745 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fc4fb97c9-v2hrg"] Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.391727 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.396395 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fc4fb97c9-v2hrg"] Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.399248 4831 generic.go:334] "Generic (PLEG): container finished" podID="88fcf3fe-10ba-460e-89bb-6b94936a183e" containerID="b271798fadb7ecde85c8ec4c83e30da4c06274d7287f6522556443c07aa234e6" exitCode=0 Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.399282 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" event={"ID":"88fcf3fe-10ba-460e-89bb-6b94936a183e","Type":"ContainerDied","Data":"b271798fadb7ecde85c8ec4c83e30da4c06274d7287f6522556443c07aa234e6"} Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.517973 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.518073 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-ovsdbserver-nb\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.518103 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-dns-swift-storage-0\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " 
pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.518234 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-ovsdbserver-sb\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.518268 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-dns-svc\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.518299 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rdw9\" (UniqueName: \"kubernetes.io/projected/06199f1c-bca8-4702-8fe5-f7e6512884f6-kube-api-access-6rdw9\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.518440 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-config\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.621770 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: 
\"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.621851 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-ovsdbserver-nb\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.621882 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-dns-swift-storage-0\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.621992 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-ovsdbserver-sb\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.622028 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-dns-svc\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.622063 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rdw9\" (UniqueName: \"kubernetes.io/projected/06199f1c-bca8-4702-8fe5-f7e6512884f6-kube-api-access-6rdw9\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " 
pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.622117 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-config\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.623290 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-ovsdbserver-sb\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.623331 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-ovsdbserver-nb\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.623297 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.623486 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-config\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.624079 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-dns-swift-storage-0\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.624950 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06199f1c-bca8-4702-8fe5-f7e6512884f6-dns-svc\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.644205 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rdw9\" (UniqueName: \"kubernetes.io/projected/06199f1c-bca8-4702-8fe5-f7e6512884f6-kube-api-access-6rdw9\") pod \"dnsmasq-dns-5fc4fb97c9-v2hrg\" (UID: \"06199f1c-bca8-4702-8fe5-f7e6512884f6\") " pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.728653 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.861370 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.928093 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-swift-storage-0\") pod \"88fcf3fe-10ba-460e-89bb-6b94936a183e\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.928452 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-nb\") pod \"88fcf3fe-10ba-460e-89bb-6b94936a183e\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.928505 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-svc\") pod \"88fcf3fe-10ba-460e-89bb-6b94936a183e\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.928560 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-config\") pod \"88fcf3fe-10ba-460e-89bb-6b94936a183e\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.928599 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp2vf\" (UniqueName: \"kubernetes.io/projected/88fcf3fe-10ba-460e-89bb-6b94936a183e-kube-api-access-kp2vf\") pod \"88fcf3fe-10ba-460e-89bb-6b94936a183e\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.928674 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-sb\") pod \"88fcf3fe-10ba-460e-89bb-6b94936a183e\" (UID: \"88fcf3fe-10ba-460e-89bb-6b94936a183e\") " Dec 04 10:38:20 crc kubenswrapper[4831]: I1204 10:38:20.949607 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88fcf3fe-10ba-460e-89bb-6b94936a183e-kube-api-access-kp2vf" (OuterVolumeSpecName: "kube-api-access-kp2vf") pod "88fcf3fe-10ba-460e-89bb-6b94936a183e" (UID: "88fcf3fe-10ba-460e-89bb-6b94936a183e"). InnerVolumeSpecName "kube-api-access-kp2vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.027422 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "88fcf3fe-10ba-460e-89bb-6b94936a183e" (UID: "88fcf3fe-10ba-460e-89bb-6b94936a183e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.032459 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.032499 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp2vf\" (UniqueName: \"kubernetes.io/projected/88fcf3fe-10ba-460e-89bb-6b94936a183e-kube-api-access-kp2vf\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.043126 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "88fcf3fe-10ba-460e-89bb-6b94936a183e" (UID: "88fcf3fe-10ba-460e-89bb-6b94936a183e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.048128 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "88fcf3fe-10ba-460e-89bb-6b94936a183e" (UID: "88fcf3fe-10ba-460e-89bb-6b94936a183e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.054268 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-config" (OuterVolumeSpecName: "config") pod "88fcf3fe-10ba-460e-89bb-6b94936a183e" (UID: "88fcf3fe-10ba-460e-89bb-6b94936a183e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.063719 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "88fcf3fe-10ba-460e-89bb-6b94936a183e" (UID: "88fcf3fe-10ba-460e-89bb-6b94936a183e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.134920 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.134952 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.134965 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.134974 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88fcf3fe-10ba-460e-89bb-6b94936a183e-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.195775 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fc4fb97c9-v2hrg"] Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.413723 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" 
event={"ID":"06199f1c-bca8-4702-8fe5-f7e6512884f6","Type":"ContainerStarted","Data":"c17f716461f8e063826f995fc9d84e69ced4eee4e101447c66fbbf204f09e73c"} Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.414044 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" event={"ID":"06199f1c-bca8-4702-8fe5-f7e6512884f6","Type":"ContainerStarted","Data":"7848037eaace158188343e1e829a6ccef2551966b29ceb48ee72785473084f11"} Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.415944 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" event={"ID":"88fcf3fe-10ba-460e-89bb-6b94936a183e","Type":"ContainerDied","Data":"9330e3ef5f38b26cb332c32d800d7e1a3b40d798412ee6e0342feaab9c88ad4f"} Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.415998 4831 scope.go:117] "RemoveContainer" containerID="b271798fadb7ecde85c8ec4c83e30da4c06274d7287f6522556443c07aa234e6" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.416028 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86dddb665-dtvkq" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.484926 4831 scope.go:117] "RemoveContainer" containerID="fd670f42a2981b0642eaa0634d258f6e32f944a33bd44d4f643decb26080e4e5" Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.494416 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dddb665-dtvkq"] Dec 04 10:38:21 crc kubenswrapper[4831]: I1204 10:38:21.503405 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86dddb665-dtvkq"] Dec 04 10:38:22 crc kubenswrapper[4831]: I1204 10:38:22.426918 4831 generic.go:334] "Generic (PLEG): container finished" podID="06199f1c-bca8-4702-8fe5-f7e6512884f6" containerID="c17f716461f8e063826f995fc9d84e69ced4eee4e101447c66fbbf204f09e73c" exitCode=0 Dec 04 10:38:22 crc kubenswrapper[4831]: I1204 10:38:22.426977 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" event={"ID":"06199f1c-bca8-4702-8fe5-f7e6512884f6","Type":"ContainerDied","Data":"c17f716461f8e063826f995fc9d84e69ced4eee4e101447c66fbbf204f09e73c"} Dec 04 10:38:23 crc kubenswrapper[4831]: I1204 10:38:23.290131 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88fcf3fe-10ba-460e-89bb-6b94936a183e" path="/var/lib/kubelet/pods/88fcf3fe-10ba-460e-89bb-6b94936a183e/volumes" Dec 04 10:38:23 crc kubenswrapper[4831]: I1204 10:38:23.441613 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" event={"ID":"06199f1c-bca8-4702-8fe5-f7e6512884f6","Type":"ContainerStarted","Data":"578cbd587b6ef06e62d19e964f857c7f1a58495be753d95e746abefc4c817aec"} Dec 04 10:38:23 crc kubenswrapper[4831]: I1204 10:38:23.441821 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:23 crc kubenswrapper[4831]: I1204 10:38:23.462096 4831 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" podStartSLOduration=3.462078417 podStartE2EDuration="3.462078417s" podCreationTimestamp="2025-12-04 10:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:38:23.457343353 +0000 UTC m=+1400.406518667" watchObservedRunningTime="2025-12-04 10:38:23.462078417 +0000 UTC m=+1400.411253731" Dec 04 10:38:30 crc kubenswrapper[4831]: I1204 10:38:30.731887 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fc4fb97c9-v2hrg" Dec 04 10:38:30 crc kubenswrapper[4831]: I1204 10:38:30.794418 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c94479b55-qlwx5"] Dec 04 10:38:30 crc kubenswrapper[4831]: I1204 10:38:30.794686 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" podUID="8d642828-f7be-4185-82f1-cfd45f494f9b" containerName="dnsmasq-dns" containerID="cri-o://ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2" gracePeriod=10 Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.289306 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.417500 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-sb\") pod \"8d642828-f7be-4185-82f1-cfd45f494f9b\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.417597 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-svc\") pod \"8d642828-f7be-4185-82f1-cfd45f494f9b\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.417637 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-nb\") pod \"8d642828-f7be-4185-82f1-cfd45f494f9b\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.417866 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-swift-storage-0\") pod \"8d642828-f7be-4185-82f1-cfd45f494f9b\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.417966 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-config\") pod \"8d642828-f7be-4185-82f1-cfd45f494f9b\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.417992 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngbjw\" 
(UniqueName: \"kubernetes.io/projected/8d642828-f7be-4185-82f1-cfd45f494f9b-kube-api-access-ngbjw\") pod \"8d642828-f7be-4185-82f1-cfd45f494f9b\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.418032 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-openstack-edpm-ipam\") pod \"8d642828-f7be-4185-82f1-cfd45f494f9b\" (UID: \"8d642828-f7be-4185-82f1-cfd45f494f9b\") " Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.423900 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d642828-f7be-4185-82f1-cfd45f494f9b-kube-api-access-ngbjw" (OuterVolumeSpecName: "kube-api-access-ngbjw") pod "8d642828-f7be-4185-82f1-cfd45f494f9b" (UID: "8d642828-f7be-4185-82f1-cfd45f494f9b"). InnerVolumeSpecName "kube-api-access-ngbjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.474819 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-config" (OuterVolumeSpecName: "config") pod "8d642828-f7be-4185-82f1-cfd45f494f9b" (UID: "8d642828-f7be-4185-82f1-cfd45f494f9b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.476242 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8d642828-f7be-4185-82f1-cfd45f494f9b" (UID: "8d642828-f7be-4185-82f1-cfd45f494f9b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.492367 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8d642828-f7be-4185-82f1-cfd45f494f9b" (UID: "8d642828-f7be-4185-82f1-cfd45f494f9b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.503878 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8d642828-f7be-4185-82f1-cfd45f494f9b" (UID: "8d642828-f7be-4185-82f1-cfd45f494f9b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.517141 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "8d642828-f7be-4185-82f1-cfd45f494f9b" (UID: "8d642828-f7be-4185-82f1-cfd45f494f9b"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.518298 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8d642828-f7be-4185-82f1-cfd45f494f9b" (UID: "8d642828-f7be-4185-82f1-cfd45f494f9b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.538403 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.538447 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.538461 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.538476 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.538489 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngbjw\" (UniqueName: \"kubernetes.io/projected/8d642828-f7be-4185-82f1-cfd45f494f9b-kube-api-access-ngbjw\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.538499 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.538509 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d642828-f7be-4185-82f1-cfd45f494f9b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.752231 
4831 generic.go:334] "Generic (PLEG): container finished" podID="8d642828-f7be-4185-82f1-cfd45f494f9b" containerID="ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2" exitCode=0 Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.752306 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.752337 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" event={"ID":"8d642828-f7be-4185-82f1-cfd45f494f9b","Type":"ContainerDied","Data":"ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2"} Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.753859 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c94479b55-qlwx5" event={"ID":"8d642828-f7be-4185-82f1-cfd45f494f9b","Type":"ContainerDied","Data":"44b592b8e5f22bb80f9cf48cdeeb80c84217f7256b36908ac8e97b0aeee6524f"} Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.753881 4831 scope.go:117] "RemoveContainer" containerID="ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.775072 4831 scope.go:117] "RemoveContainer" containerID="36f0b13d4303560308d816c71bcefaaf71a38bb0b887290881ad7dfe5be8eff7" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.794772 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c94479b55-qlwx5"] Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.807933 4831 scope.go:117] "RemoveContainer" containerID="ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2" Dec 04 10:38:31 crc kubenswrapper[4831]: E1204 10:38:31.808417 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2\": container with ID starting with 
ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2 not found: ID does not exist" containerID="ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.808451 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2"} err="failed to get container status \"ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2\": rpc error: code = NotFound desc = could not find container \"ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2\": container with ID starting with ba5855fcdee6a56e83a3c15c5b8a404b3134068c355b7bdbbdcc1eb94862ecc2 not found: ID does not exist" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.808476 4831 scope.go:117] "RemoveContainer" containerID="36f0b13d4303560308d816c71bcefaaf71a38bb0b887290881ad7dfe5be8eff7" Dec 04 10:38:31 crc kubenswrapper[4831]: E1204 10:38:31.808846 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36f0b13d4303560308d816c71bcefaaf71a38bb0b887290881ad7dfe5be8eff7\": container with ID starting with 36f0b13d4303560308d816c71bcefaaf71a38bb0b887290881ad7dfe5be8eff7 not found: ID does not exist" containerID="36f0b13d4303560308d816c71bcefaaf71a38bb0b887290881ad7dfe5be8eff7" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.808889 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36f0b13d4303560308d816c71bcefaaf71a38bb0b887290881ad7dfe5be8eff7"} err="failed to get container status \"36f0b13d4303560308d816c71bcefaaf71a38bb0b887290881ad7dfe5be8eff7\": rpc error: code = NotFound desc = could not find container \"36f0b13d4303560308d816c71bcefaaf71a38bb0b887290881ad7dfe5be8eff7\": container with ID starting with 36f0b13d4303560308d816c71bcefaaf71a38bb0b887290881ad7dfe5be8eff7 not found: ID does not 
exist" Dec 04 10:38:31 crc kubenswrapper[4831]: I1204 10:38:31.809282 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c94479b55-qlwx5"] Dec 04 10:38:33 crc kubenswrapper[4831]: I1204 10:38:33.288359 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d642828-f7be-4185-82f1-cfd45f494f9b" path="/var/lib/kubelet/pods/8d642828-f7be-4185-82f1-cfd45f494f9b/volumes" Dec 04 10:38:33 crc kubenswrapper[4831]: I1204 10:38:33.777547 4831 generic.go:334] "Generic (PLEG): container finished" podID="75f6fd12-4651-4b5f-9eec-d192367b85ad" containerID="3ff8e17d0e7fa0f6cb3e4b39b90300c3de29c1e95b07913ddaae6d333bf4bd9f" exitCode=0 Dec 04 10:38:33 crc kubenswrapper[4831]: I1204 10:38:33.777645 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75f6fd12-4651-4b5f-9eec-d192367b85ad","Type":"ContainerDied","Data":"3ff8e17d0e7fa0f6cb3e4b39b90300c3de29c1e95b07913ddaae6d333bf4bd9f"} Dec 04 10:38:34 crc kubenswrapper[4831]: I1204 10:38:34.791013 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75f6fd12-4651-4b5f-9eec-d192367b85ad","Type":"ContainerStarted","Data":"50463577b2b7183fd32a692e7c76e2b2adff14ef21d1dbd5a0bae0e3d1f7c9de"} Dec 04 10:38:34 crc kubenswrapper[4831]: I1204 10:38:34.792895 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 10:38:34 crc kubenswrapper[4831]: I1204 10:38:34.795242 4831 generic.go:334] "Generic (PLEG): container finished" podID="7544cda9-54d6-47c9-8ba1-0834b882e674" containerID="d93f2d24d73a2018885211a49ed6910ce3b853978a2b5c499b7e78e7a4c5eb2a" exitCode=0 Dec 04 10:38:34 crc kubenswrapper[4831]: I1204 10:38:34.795363 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"7544cda9-54d6-47c9-8ba1-0834b882e674","Type":"ContainerDied","Data":"d93f2d24d73a2018885211a49ed6910ce3b853978a2b5c499b7e78e7a4c5eb2a"} Dec 04 10:38:34 crc kubenswrapper[4831]: I1204 10:38:34.823014 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.822994138 podStartE2EDuration="35.822994138s" podCreationTimestamp="2025-12-04 10:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:38:34.815619834 +0000 UTC m=+1411.764795168" watchObservedRunningTime="2025-12-04 10:38:34.822994138 +0000 UTC m=+1411.772169452" Dec 04 10:38:35 crc kubenswrapper[4831]: I1204 10:38:35.807248 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7544cda9-54d6-47c9-8ba1-0834b882e674","Type":"ContainerStarted","Data":"510f3a0d5b7b73e12e36caa1f3e4d35aa7f7c61dbd5b905d4d6d35cafbabc070"} Dec 04 10:38:35 crc kubenswrapper[4831]: I1204 10:38:35.808067 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:35 crc kubenswrapper[4831]: I1204 10:38:35.840352 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.84032631 podStartE2EDuration="35.84032631s" podCreationTimestamp="2025-12-04 10:38:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:38:35.826941378 +0000 UTC m=+1412.776116702" watchObservedRunningTime="2025-12-04 10:38:35.84032631 +0000 UTC m=+1412.789501654" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.227787 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp"] Dec 04 10:38:49 crc kubenswrapper[4831]: E1204 
10:38:49.228764 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d642828-f7be-4185-82f1-cfd45f494f9b" containerName="dnsmasq-dns" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.228779 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d642828-f7be-4185-82f1-cfd45f494f9b" containerName="dnsmasq-dns" Dec 04 10:38:49 crc kubenswrapper[4831]: E1204 10:38:49.228804 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88fcf3fe-10ba-460e-89bb-6b94936a183e" containerName="dnsmasq-dns" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.228812 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fcf3fe-10ba-460e-89bb-6b94936a183e" containerName="dnsmasq-dns" Dec 04 10:38:49 crc kubenswrapper[4831]: E1204 10:38:49.228832 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d642828-f7be-4185-82f1-cfd45f494f9b" containerName="init" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.228839 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d642828-f7be-4185-82f1-cfd45f494f9b" containerName="init" Dec 04 10:38:49 crc kubenswrapper[4831]: E1204 10:38:49.228847 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88fcf3fe-10ba-460e-89bb-6b94936a183e" containerName="init" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.228853 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fcf3fe-10ba-460e-89bb-6b94936a183e" containerName="init" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.229044 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d642828-f7be-4185-82f1-cfd45f494f9b" containerName="dnsmasq-dns" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.229053 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="88fcf3fe-10ba-460e-89bb-6b94936a183e" containerName="dnsmasq-dns" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.229955 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.233961 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.234213 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.234491 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.241328 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.245991 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp"] Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.294800 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25thz\" (UniqueName: \"kubernetes.io/projected/e4d67c27-1b02-4823-8635-621753ee6278-kube-api-access-25thz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.294953 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.295087 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.295139 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.397337 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.397778 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25thz\" (UniqueName: \"kubernetes.io/projected/e4d67c27-1b02-4823-8635-621753ee6278-kube-api-access-25thz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.398039 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.398287 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.403494 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.404077 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.404554 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.419040 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25thz\" (UniqueName: \"kubernetes.io/projected/e4d67c27-1b02-4823-8635-621753ee6278-kube-api-access-25thz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.560066 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:38:49 crc kubenswrapper[4831]: I1204 10:38:49.571976 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 10:38:50 crc kubenswrapper[4831]: I1204 10:38:50.365708 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp"] Dec 04 10:38:50 crc kubenswrapper[4831]: W1204 10:38:50.367245 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4d67c27_1b02_4823_8635_621753ee6278.slice/crio-1c2ed393bbf9bc1316a96bbde5b8bddef7484ce4c7adafb5ab863d440d1bb4b8 WatchSource:0}: Error finding container 1c2ed393bbf9bc1316a96bbde5b8bddef7484ce4c7adafb5ab863d440d1bb4b8: Status 404 returned error can't find the container with id 1c2ed393bbf9bc1316a96bbde5b8bddef7484ce4c7adafb5ab863d440d1bb4b8 Dec 04 10:38:50 crc kubenswrapper[4831]: I1204 10:38:50.564025 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:38:50 crc kubenswrapper[4831]: I1204 10:38:50.951719 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" event={"ID":"e4d67c27-1b02-4823-8635-621753ee6278","Type":"ContainerStarted","Data":"1c2ed393bbf9bc1316a96bbde5b8bddef7484ce4c7adafb5ab863d440d1bb4b8"} 
Dec 04 10:38:51 crc kubenswrapper[4831]: I1204 10:38:51.468474 4831 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod88fcf3fe-10ba-460e-89bb-6b94936a183e"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod88fcf3fe-10ba-460e-89bb-6b94936a183e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod88fcf3fe_10ba_460e_89bb_6b94936a183e.slice" Dec 04 10:38:51 crc kubenswrapper[4831]: I1204 10:38:51.971118 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:38:51 crc kubenswrapper[4831]: I1204 10:38:51.971350 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:39:00 crc kubenswrapper[4831]: I1204 10:39:00.037193 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" event={"ID":"e4d67c27-1b02-4823-8635-621753ee6278","Type":"ContainerStarted","Data":"8b31bc8ef0ef1cf000f59402d71ef2045bfcc0b650ca44dddd26ebaa9cdcdda7"} Dec 04 10:39:00 crc kubenswrapper[4831]: I1204 10:39:00.058764 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" podStartSLOduration=2.045304385 podStartE2EDuration="11.058746011s" podCreationTimestamp="2025-12-04 10:38:49 +0000 UTC" firstStartedPulling="2025-12-04 10:38:50.37060594 +0000 UTC m=+1427.319781254" lastFinishedPulling="2025-12-04 10:38:59.384047566 +0000 UTC 
m=+1436.333222880" observedRunningTime="2025-12-04 10:39:00.05186363 +0000 UTC m=+1437.001038964" watchObservedRunningTime="2025-12-04 10:39:00.058746011 +0000 UTC m=+1437.007921325" Dec 04 10:39:11 crc kubenswrapper[4831]: I1204 10:39:11.146879 4831 generic.go:334] "Generic (PLEG): container finished" podID="e4d67c27-1b02-4823-8635-621753ee6278" containerID="8b31bc8ef0ef1cf000f59402d71ef2045bfcc0b650ca44dddd26ebaa9cdcdda7" exitCode=0 Dec 04 10:39:11 crc kubenswrapper[4831]: I1204 10:39:11.146959 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" event={"ID":"e4d67c27-1b02-4823-8635-621753ee6278","Type":"ContainerDied","Data":"8b31bc8ef0ef1cf000f59402d71ef2045bfcc0b650ca44dddd26ebaa9cdcdda7"} Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.629783 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.662156 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-repo-setup-combined-ca-bundle\") pod \"e4d67c27-1b02-4823-8635-621753ee6278\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.663389 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25thz\" (UniqueName: \"kubernetes.io/projected/e4d67c27-1b02-4823-8635-621753ee6278-kube-api-access-25thz\") pod \"e4d67c27-1b02-4823-8635-621753ee6278\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.663455 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-ssh-key\") pod 
\"e4d67c27-1b02-4823-8635-621753ee6278\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.663573 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-inventory\") pod \"e4d67c27-1b02-4823-8635-621753ee6278\" (UID: \"e4d67c27-1b02-4823-8635-621753ee6278\") " Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.668998 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e4d67c27-1b02-4823-8635-621753ee6278" (UID: "e4d67c27-1b02-4823-8635-621753ee6278"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.672846 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4d67c27-1b02-4823-8635-621753ee6278-kube-api-access-25thz" (OuterVolumeSpecName: "kube-api-access-25thz") pod "e4d67c27-1b02-4823-8635-621753ee6278" (UID: "e4d67c27-1b02-4823-8635-621753ee6278"). InnerVolumeSpecName "kube-api-access-25thz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.695825 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-inventory" (OuterVolumeSpecName: "inventory") pod "e4d67c27-1b02-4823-8635-621753ee6278" (UID: "e4d67c27-1b02-4823-8635-621753ee6278"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.717268 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e4d67c27-1b02-4823-8635-621753ee6278" (UID: "e4d67c27-1b02-4823-8635-621753ee6278"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.766399 4831 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.766459 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25thz\" (UniqueName: \"kubernetes.io/projected/e4d67c27-1b02-4823-8635-621753ee6278-kube-api-access-25thz\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.766475 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:12 crc kubenswrapper[4831]: I1204 10:39:12.766488 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4d67c27-1b02-4823-8635-621753ee6278-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.171820 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" event={"ID":"e4d67c27-1b02-4823-8635-621753ee6278","Type":"ContainerDied","Data":"1c2ed393bbf9bc1316a96bbde5b8bddef7484ce4c7adafb5ab863d440d1bb4b8"} Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.172128 4831 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="1c2ed393bbf9bc1316a96bbde5b8bddef7484ce4c7adafb5ab863d440d1bb4b8" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.171920 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.258714 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc"] Dec 04 10:39:13 crc kubenswrapper[4831]: E1204 10:39:13.259152 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4d67c27-1b02-4823-8635-621753ee6278" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.259171 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4d67c27-1b02-4823-8635-621753ee6278" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.259405 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4d67c27-1b02-4823-8635-621753ee6278" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.260132 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.263161 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.263281 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.264774 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.266885 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.292712 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc"] Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.378036 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lrlqc\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.378199 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn6tm\" (UniqueName: \"kubernetes.io/projected/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-kube-api-access-cn6tm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lrlqc\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.378328 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lrlqc\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.480082 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn6tm\" (UniqueName: \"kubernetes.io/projected/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-kube-api-access-cn6tm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lrlqc\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.480181 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lrlqc\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.480247 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lrlqc\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.484733 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lrlqc\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.485101 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lrlqc\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.498133 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn6tm\" (UniqueName: \"kubernetes.io/projected/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-kube-api-access-cn6tm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lrlqc\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:13 crc kubenswrapper[4831]: I1204 10:39:13.584452 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.162066 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc"] Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.184579 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" event={"ID":"ea309f4e-4dea-4d32-b2ac-7ecf505d9341","Type":"ContainerStarted","Data":"b324695e2c1712106102ad3428fb9d1c36b28a1b8c14bd8f8d65fa4d7c5db978"} Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.553449 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jrs94"] Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.559718 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.573581 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jrs94"] Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.606109 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-utilities\") pod \"redhat-operators-jrs94\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.606176 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-catalog-content\") pod \"redhat-operators-jrs94\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.606237 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hrj8\" (UniqueName: \"kubernetes.io/projected/5a075172-f827-4466-bda1-1c2cbca1ba32-kube-api-access-2hrj8\") pod \"redhat-operators-jrs94\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.707917 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-utilities\") pod \"redhat-operators-jrs94\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.708192 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-catalog-content\") pod \"redhat-operators-jrs94\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.708300 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hrj8\" (UniqueName: \"kubernetes.io/projected/5a075172-f827-4466-bda1-1c2cbca1ba32-kube-api-access-2hrj8\") pod \"redhat-operators-jrs94\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.708489 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-utilities\") pod \"redhat-operators-jrs94\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.708713 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-catalog-content\") pod \"redhat-operators-jrs94\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.729409 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hrj8\" (UniqueName: \"kubernetes.io/projected/5a075172-f827-4466-bda1-1c2cbca1ba32-kube-api-access-2hrj8\") pod \"redhat-operators-jrs94\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:14 crc kubenswrapper[4831]: I1204 10:39:14.883931 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:15 crc kubenswrapper[4831]: I1204 10:39:15.202405 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" event={"ID":"ea309f4e-4dea-4d32-b2ac-7ecf505d9341","Type":"ContainerStarted","Data":"9ce558c11a1d024378ec028f1a3d318ff7f75091d214e0423af4528421a095e5"} Dec 04 10:39:15 crc kubenswrapper[4831]: I1204 10:39:15.227565 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" podStartSLOduration=1.7997606 podStartE2EDuration="2.227545155s" podCreationTimestamp="2025-12-04 10:39:13 +0000 UTC" firstStartedPulling="2025-12-04 10:39:14.161821172 +0000 UTC m=+1451.110996486" lastFinishedPulling="2025-12-04 10:39:14.589605727 +0000 UTC m=+1451.538781041" observedRunningTime="2025-12-04 10:39:15.219632227 +0000 UTC m=+1452.168807561" watchObservedRunningTime="2025-12-04 10:39:15.227545155 +0000 UTC m=+1452.176720469" Dec 04 10:39:15 crc kubenswrapper[4831]: I1204 10:39:15.380055 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jrs94"] Dec 04 10:39:15 crc kubenswrapper[4831]: W1204 10:39:15.383209 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a075172_f827_4466_bda1_1c2cbca1ba32.slice/crio-575fff9245a2426525995c1307f2c063210ca69eca77afb9465f8ac1a6c55537 WatchSource:0}: Error finding container 575fff9245a2426525995c1307f2c063210ca69eca77afb9465f8ac1a6c55537: Status 404 returned error can't find the container with id 575fff9245a2426525995c1307f2c063210ca69eca77afb9465f8ac1a6c55537 Dec 04 10:39:16 crc kubenswrapper[4831]: I1204 10:39:16.215986 4831 generic.go:334] "Generic (PLEG): container finished" podID="5a075172-f827-4466-bda1-1c2cbca1ba32" 
containerID="9752beb61d89197ec6e30d62c7168974364904860f2753d42bd089eef626a7c4" exitCode=0 Dec 04 10:39:16 crc kubenswrapper[4831]: I1204 10:39:16.216044 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrs94" event={"ID":"5a075172-f827-4466-bda1-1c2cbca1ba32","Type":"ContainerDied","Data":"9752beb61d89197ec6e30d62c7168974364904860f2753d42bd089eef626a7c4"} Dec 04 10:39:16 crc kubenswrapper[4831]: I1204 10:39:16.216355 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrs94" event={"ID":"5a075172-f827-4466-bda1-1c2cbca1ba32","Type":"ContainerStarted","Data":"575fff9245a2426525995c1307f2c063210ca69eca77afb9465f8ac1a6c55537"} Dec 04 10:39:18 crc kubenswrapper[4831]: I1204 10:39:18.234171 4831 generic.go:334] "Generic (PLEG): container finished" podID="ea309f4e-4dea-4d32-b2ac-7ecf505d9341" containerID="9ce558c11a1d024378ec028f1a3d318ff7f75091d214e0423af4528421a095e5" exitCode=0 Dec 04 10:39:18 crc kubenswrapper[4831]: I1204 10:39:18.234265 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" event={"ID":"ea309f4e-4dea-4d32-b2ac-7ecf505d9341","Type":"ContainerDied","Data":"9ce558c11a1d024378ec028f1a3d318ff7f75091d214e0423af4528421a095e5"} Dec 04 10:39:18 crc kubenswrapper[4831]: I1204 10:39:18.237383 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrs94" event={"ID":"5a075172-f827-4466-bda1-1c2cbca1ba32","Type":"ContainerStarted","Data":"8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c"} Dec 04 10:39:19 crc kubenswrapper[4831]: I1204 10:39:19.711049 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:19 crc kubenswrapper[4831]: I1204 10:39:19.815849 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-inventory\") pod \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " Dec 04 10:39:19 crc kubenswrapper[4831]: I1204 10:39:19.815905 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn6tm\" (UniqueName: \"kubernetes.io/projected/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-kube-api-access-cn6tm\") pod \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " Dec 04 10:39:19 crc kubenswrapper[4831]: I1204 10:39:19.815973 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-ssh-key\") pod \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\" (UID: \"ea309f4e-4dea-4d32-b2ac-7ecf505d9341\") " Dec 04 10:39:19 crc kubenswrapper[4831]: I1204 10:39:19.821572 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-kube-api-access-cn6tm" (OuterVolumeSpecName: "kube-api-access-cn6tm") pod "ea309f4e-4dea-4d32-b2ac-7ecf505d9341" (UID: "ea309f4e-4dea-4d32-b2ac-7ecf505d9341"). InnerVolumeSpecName "kube-api-access-cn6tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:39:19 crc kubenswrapper[4831]: I1204 10:39:19.843932 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea309f4e-4dea-4d32-b2ac-7ecf505d9341" (UID: "ea309f4e-4dea-4d32-b2ac-7ecf505d9341"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:39:19 crc kubenswrapper[4831]: I1204 10:39:19.845594 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-inventory" (OuterVolumeSpecName: "inventory") pod "ea309f4e-4dea-4d32-b2ac-7ecf505d9341" (UID: "ea309f4e-4dea-4d32-b2ac-7ecf505d9341"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:39:19 crc kubenswrapper[4831]: I1204 10:39:19.918327 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn6tm\" (UniqueName: \"kubernetes.io/projected/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-kube-api-access-cn6tm\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:19 crc kubenswrapper[4831]: I1204 10:39:19.918369 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:19 crc kubenswrapper[4831]: I1204 10:39:19.918381 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea309f4e-4dea-4d32-b2ac-7ecf505d9341-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.259786 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" event={"ID":"ea309f4e-4dea-4d32-b2ac-7ecf505d9341","Type":"ContainerDied","Data":"b324695e2c1712106102ad3428fb9d1c36b28a1b8c14bd8f8d65fa4d7c5db978"} Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.259830 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lrlqc" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.259835 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b324695e2c1712106102ad3428fb9d1c36b28a1b8c14bd8f8d65fa4d7c5db978" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.417213 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s"] Dec 04 10:39:20 crc kubenswrapper[4831]: E1204 10:39:20.417617 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea309f4e-4dea-4d32-b2ac-7ecf505d9341" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.417635 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea309f4e-4dea-4d32-b2ac-7ecf505d9341" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.417841 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea309f4e-4dea-4d32-b2ac-7ecf505d9341" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.418535 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.421526 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.421679 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.422038 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.422067 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.428255 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s"] Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.530178 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qkfg\" (UniqueName: \"kubernetes.io/projected/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-kube-api-access-9qkfg\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.530563 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.530790 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.530965 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.633583 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.633787 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.634616 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qkfg\" (UniqueName: \"kubernetes.io/projected/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-kube-api-access-9qkfg\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.635250 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.639146 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.639360 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.642759 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.654045 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-9qkfg\" (UniqueName: \"kubernetes.io/projected/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-kube-api-access-9qkfg\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:20 crc kubenswrapper[4831]: I1204 10:39:20.741173 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:39:21 crc kubenswrapper[4831]: I1204 10:39:21.271207 4831 generic.go:334] "Generic (PLEG): container finished" podID="5a075172-f827-4466-bda1-1c2cbca1ba32" containerID="8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c" exitCode=0 Dec 04 10:39:21 crc kubenswrapper[4831]: I1204 10:39:21.271301 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrs94" event={"ID":"5a075172-f827-4466-bda1-1c2cbca1ba32","Type":"ContainerDied","Data":"8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c"} Dec 04 10:39:21 crc kubenswrapper[4831]: I1204 10:39:21.656435 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s"] Dec 04 10:39:21 crc kubenswrapper[4831]: W1204 10:39:21.661139 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d4bb48f_fa66_44cf_ab52_2fd190bdde16.slice/crio-748b7519661f133a6b91ca3b6980f0972415ba9ae0cb16685df526c681a8c3f9 WatchSource:0}: Error finding container 748b7519661f133a6b91ca3b6980f0972415ba9ae0cb16685df526c681a8c3f9: Status 404 returned error can't find the container with id 748b7519661f133a6b91ca3b6980f0972415ba9ae0cb16685df526c681a8c3f9 Dec 04 10:39:21 crc kubenswrapper[4831]: I1204 10:39:21.971441 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:39:21 crc kubenswrapper[4831]: I1204 10:39:21.971513 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:39:22 crc kubenswrapper[4831]: I1204 10:39:22.284261 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" event={"ID":"7d4bb48f-fa66-44cf-ab52-2fd190bdde16","Type":"ContainerStarted","Data":"748b7519661f133a6b91ca3b6980f0972415ba9ae0cb16685df526c681a8c3f9"} Dec 04 10:39:22 crc kubenswrapper[4831]: I1204 10:39:22.931447 4831 scope.go:117] "RemoveContainer" containerID="1e9cdf3fb9bb3ae6888f5cbab78cc708c0cf20a3e5f2b25201f398548b6207aa" Dec 04 10:39:23 crc kubenswrapper[4831]: I1204 10:39:23.084422 4831 scope.go:117] "RemoveContainer" containerID="82c03e7465660e1854decede12406448d7b3c039a3c938df11039f030fd90513" Dec 04 10:39:23 crc kubenswrapper[4831]: I1204 10:39:23.139243 4831 scope.go:117] "RemoveContainer" containerID="a2df90fc4a087c6a37c1d21c67b8f2758ee28b366dc705cb113621a7fb7d3078" Dec 04 10:39:23 crc kubenswrapper[4831]: I1204 10:39:23.304476 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" event={"ID":"7d4bb48f-fa66-44cf-ab52-2fd190bdde16","Type":"ContainerStarted","Data":"9f8d35e23c1a49ccf1c32d15df7791375b56ac864c9a7b754d8e5491ab8582f1"} Dec 04 10:39:23 crc kubenswrapper[4831]: I1204 10:39:23.309250 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrs94" 
event={"ID":"5a075172-f827-4466-bda1-1c2cbca1ba32","Type":"ContainerStarted","Data":"1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c"} Dec 04 10:39:23 crc kubenswrapper[4831]: I1204 10:39:23.350727 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jrs94" podStartSLOduration=3.340792363 podStartE2EDuration="9.350700007s" podCreationTimestamp="2025-12-04 10:39:14 +0000 UTC" firstStartedPulling="2025-12-04 10:39:16.218781441 +0000 UTC m=+1453.167956765" lastFinishedPulling="2025-12-04 10:39:22.228689095 +0000 UTC m=+1459.177864409" observedRunningTime="2025-12-04 10:39:23.33901826 +0000 UTC m=+1460.288193574" watchObservedRunningTime="2025-12-04 10:39:23.350700007 +0000 UTC m=+1460.299875321" Dec 04 10:39:23 crc kubenswrapper[4831]: I1204 10:39:23.362324 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" podStartSLOduration=2.6920164939999998 podStartE2EDuration="3.362307813s" podCreationTimestamp="2025-12-04 10:39:20 +0000 UTC" firstStartedPulling="2025-12-04 10:39:21.664906836 +0000 UTC m=+1458.614082150" lastFinishedPulling="2025-12-04 10:39:22.335198155 +0000 UTC m=+1459.284373469" observedRunningTime="2025-12-04 10:39:23.352482764 +0000 UTC m=+1460.301658078" watchObservedRunningTime="2025-12-04 10:39:23.362307813 +0000 UTC m=+1460.311483127" Dec 04 10:39:24 crc kubenswrapper[4831]: I1204 10:39:24.884092 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:24 crc kubenswrapper[4831]: I1204 10:39:24.884421 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:25 crc kubenswrapper[4831]: I1204 10:39:25.937438 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jrs94" 
podUID="5a075172-f827-4466-bda1-1c2cbca1ba32" containerName="registry-server" probeResult="failure" output=< Dec 04 10:39:25 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 04 10:39:25 crc kubenswrapper[4831]: > Dec 04 10:39:34 crc kubenswrapper[4831]: I1204 10:39:34.933333 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:34 crc kubenswrapper[4831]: I1204 10:39:34.983119 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:35 crc kubenswrapper[4831]: I1204 10:39:35.174027 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jrs94"] Dec 04 10:39:36 crc kubenswrapper[4831]: I1204 10:39:36.460196 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jrs94" podUID="5a075172-f827-4466-bda1-1c2cbca1ba32" containerName="registry-server" containerID="cri-o://1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c" gracePeriod=2 Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.105260 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.197231 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-catalog-content\") pod \"5a075172-f827-4466-bda1-1c2cbca1ba32\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.197519 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-utilities\") pod \"5a075172-f827-4466-bda1-1c2cbca1ba32\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.197635 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hrj8\" (UniqueName: \"kubernetes.io/projected/5a075172-f827-4466-bda1-1c2cbca1ba32-kube-api-access-2hrj8\") pod \"5a075172-f827-4466-bda1-1c2cbca1ba32\" (UID: \"5a075172-f827-4466-bda1-1c2cbca1ba32\") " Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.198278 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-utilities" (OuterVolumeSpecName: "utilities") pod "5a075172-f827-4466-bda1-1c2cbca1ba32" (UID: "5a075172-f827-4466-bda1-1c2cbca1ba32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.205870 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a075172-f827-4466-bda1-1c2cbca1ba32-kube-api-access-2hrj8" (OuterVolumeSpecName: "kube-api-access-2hrj8") pod "5a075172-f827-4466-bda1-1c2cbca1ba32" (UID: "5a075172-f827-4466-bda1-1c2cbca1ba32"). InnerVolumeSpecName "kube-api-access-2hrj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.299872 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hrj8\" (UniqueName: \"kubernetes.io/projected/5a075172-f827-4466-bda1-1c2cbca1ba32-kube-api-access-2hrj8\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.299900 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.310822 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a075172-f827-4466-bda1-1c2cbca1ba32" (UID: "5a075172-f827-4466-bda1-1c2cbca1ba32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.402304 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a075172-f827-4466-bda1-1c2cbca1ba32-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.471769 4831 generic.go:334] "Generic (PLEG): container finished" podID="5a075172-f827-4466-bda1-1c2cbca1ba32" containerID="1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c" exitCode=0 Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.471812 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrs94" event={"ID":"5a075172-f827-4466-bda1-1c2cbca1ba32","Type":"ContainerDied","Data":"1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c"} Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.471837 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-jrs94" event={"ID":"5a075172-f827-4466-bda1-1c2cbca1ba32","Type":"ContainerDied","Data":"575fff9245a2426525995c1307f2c063210ca69eca77afb9465f8ac1a6c55537"} Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.471840 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jrs94" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.471853 4831 scope.go:117] "RemoveContainer" containerID="1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.493340 4831 scope.go:117] "RemoveContainer" containerID="8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.513709 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jrs94"] Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.523578 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jrs94"] Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.536597 4831 scope.go:117] "RemoveContainer" containerID="9752beb61d89197ec6e30d62c7168974364904860f2753d42bd089eef626a7c4" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.576814 4831 scope.go:117] "RemoveContainer" containerID="1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c" Dec 04 10:39:37 crc kubenswrapper[4831]: E1204 10:39:37.578114 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c\": container with ID starting with 1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c not found: ID does not exist" containerID="1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.578174 4831 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c"} err="failed to get container status \"1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c\": rpc error: code = NotFound desc = could not find container \"1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c\": container with ID starting with 1ca925222e35134e67b940e4b0a7fad6f8efa333776395edc7a21a6720c7d42c not found: ID does not exist" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.578211 4831 scope.go:117] "RemoveContainer" containerID="8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c" Dec 04 10:39:37 crc kubenswrapper[4831]: E1204 10:39:37.578812 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c\": container with ID starting with 8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c not found: ID does not exist" containerID="8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.578846 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c"} err="failed to get container status \"8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c\": rpc error: code = NotFound desc = could not find container \"8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c\": container with ID starting with 8f881fa748ee2ff422b49b88e3e107c3f44ed48ea26a790b0bfb9a8b9ba0c18c not found: ID does not exist" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.578864 4831 scope.go:117] "RemoveContainer" containerID="9752beb61d89197ec6e30d62c7168974364904860f2753d42bd089eef626a7c4" Dec 04 10:39:37 crc kubenswrapper[4831]: E1204 
10:39:37.579122 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9752beb61d89197ec6e30d62c7168974364904860f2753d42bd089eef626a7c4\": container with ID starting with 9752beb61d89197ec6e30d62c7168974364904860f2753d42bd089eef626a7c4 not found: ID does not exist" containerID="9752beb61d89197ec6e30d62c7168974364904860f2753d42bd089eef626a7c4" Dec 04 10:39:37 crc kubenswrapper[4831]: I1204 10:39:37.579149 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9752beb61d89197ec6e30d62c7168974364904860f2753d42bd089eef626a7c4"} err="failed to get container status \"9752beb61d89197ec6e30d62c7168974364904860f2753d42bd089eef626a7c4\": rpc error: code = NotFound desc = could not find container \"9752beb61d89197ec6e30d62c7168974364904860f2753d42bd089eef626a7c4\": container with ID starting with 9752beb61d89197ec6e30d62c7168974364904860f2753d42bd089eef626a7c4 not found: ID does not exist" Dec 04 10:39:39 crc kubenswrapper[4831]: I1204 10:39:39.289873 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a075172-f827-4466-bda1-1c2cbca1ba32" path="/var/lib/kubelet/pods/5a075172-f827-4466-bda1-1c2cbca1ba32/volumes" Dec 04 10:39:51 crc kubenswrapper[4831]: I1204 10:39:51.972018 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:39:51 crc kubenswrapper[4831]: I1204 10:39:51.972624 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 04 10:39:51 crc kubenswrapper[4831]: I1204 10:39:51.972690 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:39:51 crc kubenswrapper[4831]: I1204 10:39:51.973491 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61f7279f438426e6b31be64e9ccb62c729f466d054e9b0e0a804c882066b625e"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:39:51 crc kubenswrapper[4831]: I1204 10:39:51.973560 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://61f7279f438426e6b31be64e9ccb62c729f466d054e9b0e0a804c882066b625e" gracePeriod=600 Dec 04 10:39:52 crc kubenswrapper[4831]: I1204 10:39:52.624626 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="61f7279f438426e6b31be64e9ccb62c729f466d054e9b0e0a804c882066b625e" exitCode=0 Dec 04 10:39:52 crc kubenswrapper[4831]: I1204 10:39:52.624687 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"61f7279f438426e6b31be64e9ccb62c729f466d054e9b0e0a804c882066b625e"} Dec 04 10:39:52 crc kubenswrapper[4831]: I1204 10:39:52.625141 4831 scope.go:117] "RemoveContainer" containerID="0123b3bc298be4c2ac62175c379dc6efb186183e599f1998133a95b106c98408" Dec 04 10:39:53 crc kubenswrapper[4831]: I1204 10:39:53.638761 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" 
event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58"} Dec 04 10:40:23 crc kubenswrapper[4831]: I1204 10:40:23.389365 4831 scope.go:117] "RemoveContainer" containerID="20ad925ebfdeacb2990ca32c947d6b7e78747b6b27ce748e6d9105619c7a23ab" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.655698 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bfx7n"] Dec 04 10:40:38 crc kubenswrapper[4831]: E1204 10:40:38.656944 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a075172-f827-4466-bda1-1c2cbca1ba32" containerName="extract-utilities" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.656965 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a075172-f827-4466-bda1-1c2cbca1ba32" containerName="extract-utilities" Dec 04 10:40:38 crc kubenswrapper[4831]: E1204 10:40:38.656993 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a075172-f827-4466-bda1-1c2cbca1ba32" containerName="extract-content" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.657006 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a075172-f827-4466-bda1-1c2cbca1ba32" containerName="extract-content" Dec 04 10:40:38 crc kubenswrapper[4831]: E1204 10:40:38.657064 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a075172-f827-4466-bda1-1c2cbca1ba32" containerName="registry-server" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.657076 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a075172-f827-4466-bda1-1c2cbca1ba32" containerName="registry-server" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.657450 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a075172-f827-4466-bda1-1c2cbca1ba32" containerName="registry-server" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.660071 4831 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.674359 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bfx7n"] Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.702353 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh5bq\" (UniqueName: \"kubernetes.io/projected/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-kube-api-access-wh5bq\") pod \"community-operators-bfx7n\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.702433 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-utilities\") pod \"community-operators-bfx7n\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.702475 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-catalog-content\") pod \"community-operators-bfx7n\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.803726 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh5bq\" (UniqueName: \"kubernetes.io/projected/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-kube-api-access-wh5bq\") pod \"community-operators-bfx7n\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.803803 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-utilities\") pod \"community-operators-bfx7n\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.803845 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-catalog-content\") pod \"community-operators-bfx7n\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.804432 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-catalog-content\") pod \"community-operators-bfx7n\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.804730 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-utilities\") pod \"community-operators-bfx7n\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.824221 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh5bq\" (UniqueName: \"kubernetes.io/projected/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-kube-api-access-wh5bq\") pod \"community-operators-bfx7n\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:38 crc kubenswrapper[4831]: I1204 10:40:38.989708 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:39 crc kubenswrapper[4831]: I1204 10:40:39.550355 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bfx7n"] Dec 04 10:40:40 crc kubenswrapper[4831]: I1204 10:40:40.169941 4831 generic.go:334] "Generic (PLEG): container finished" podID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" containerID="dc89be2816cb1a1e1f2169f499cef01e025f7efb61fac2b5682b40d1af960a39" exitCode=0 Dec 04 10:40:40 crc kubenswrapper[4831]: I1204 10:40:40.170064 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfx7n" event={"ID":"e033ffc7-4ae8-4af5-8e16-8d44c339e76e","Type":"ContainerDied","Data":"dc89be2816cb1a1e1f2169f499cef01e025f7efb61fac2b5682b40d1af960a39"} Dec 04 10:40:40 crc kubenswrapper[4831]: I1204 10:40:40.170282 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfx7n" event={"ID":"e033ffc7-4ae8-4af5-8e16-8d44c339e76e","Type":"ContainerStarted","Data":"a3593e2d668160a7b7007b64568d63e2b7d79c53fa02586ab959338cf9263f3e"} Dec 04 10:40:42 crc kubenswrapper[4831]: I1204 10:40:42.190575 4831 generic.go:334] "Generic (PLEG): container finished" podID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" containerID="cc0149d2fbe75ec270bf992b124aac4ab3f33a60e0c18b3619a6099518dbf71b" exitCode=0 Dec 04 10:40:42 crc kubenswrapper[4831]: I1204 10:40:42.190628 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfx7n" event={"ID":"e033ffc7-4ae8-4af5-8e16-8d44c339e76e","Type":"ContainerDied","Data":"cc0149d2fbe75ec270bf992b124aac4ab3f33a60e0c18b3619a6099518dbf71b"} Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.204702 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfx7n" 
event={"ID":"e033ffc7-4ae8-4af5-8e16-8d44c339e76e","Type":"ContainerStarted","Data":"72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1"} Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.225486 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bfx7n" podStartSLOduration=2.4600370160000002 podStartE2EDuration="5.225470368s" podCreationTimestamp="2025-12-04 10:40:38 +0000 UTC" firstStartedPulling="2025-12-04 10:40:40.172051356 +0000 UTC m=+1537.121226670" lastFinishedPulling="2025-12-04 10:40:42.937484708 +0000 UTC m=+1539.886660022" observedRunningTime="2025-12-04 10:40:43.224197144 +0000 UTC m=+1540.173372458" watchObservedRunningTime="2025-12-04 10:40:43.225470368 +0000 UTC m=+1540.174645682" Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.432165 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dfvkj"] Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.434597 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.444756 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dfvkj"] Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.603299 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79xzk\" (UniqueName: \"kubernetes.io/projected/9437113b-ead7-4721-9bcb-8822aa9b3415-kube-api-access-79xzk\") pod \"certified-operators-dfvkj\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.603723 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-utilities\") pod \"certified-operators-dfvkj\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.603891 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-catalog-content\") pod \"certified-operators-dfvkj\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.706166 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-utilities\") pod \"certified-operators-dfvkj\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.706252 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-catalog-content\") pod \"certified-operators-dfvkj\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.706347 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79xzk\" (UniqueName: \"kubernetes.io/projected/9437113b-ead7-4721-9bcb-8822aa9b3415-kube-api-access-79xzk\") pod \"certified-operators-dfvkj\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.707090 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-utilities\") pod \"certified-operators-dfvkj\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.707121 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-catalog-content\") pod \"certified-operators-dfvkj\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.732178 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79xzk\" (UniqueName: \"kubernetes.io/projected/9437113b-ead7-4721-9bcb-8822aa9b3415-kube-api-access-79xzk\") pod \"certified-operators-dfvkj\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:43 crc kubenswrapper[4831]: I1204 10:40:43.752905 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:44 crc kubenswrapper[4831]: I1204 10:40:44.292598 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dfvkj"] Dec 04 10:40:45 crc kubenswrapper[4831]: I1204 10:40:45.230010 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfvkj" event={"ID":"9437113b-ead7-4721-9bcb-8822aa9b3415","Type":"ContainerStarted","Data":"ce1138a9ebbe699081ed217de5748ee1380aea967b47732579324d26c65b0237"} Dec 04 10:40:46 crc kubenswrapper[4831]: I1204 10:40:46.243485 4831 generic.go:334] "Generic (PLEG): container finished" podID="9437113b-ead7-4721-9bcb-8822aa9b3415" containerID="9bf39f95ea7f41d9416ce47e3ca781e1c0fad79cbca2f3d86601c4fb8697e17f" exitCode=0 Dec 04 10:40:46 crc kubenswrapper[4831]: I1204 10:40:46.243568 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfvkj" event={"ID":"9437113b-ead7-4721-9bcb-8822aa9b3415","Type":"ContainerDied","Data":"9bf39f95ea7f41d9416ce47e3ca781e1c0fad79cbca2f3d86601c4fb8697e17f"} Dec 04 10:40:46 crc kubenswrapper[4831]: I1204 10:40:46.246113 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:40:48 crc kubenswrapper[4831]: I1204 10:40:48.990257 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:48 crc kubenswrapper[4831]: I1204 10:40:48.990652 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:49 crc kubenswrapper[4831]: I1204 10:40:49.037872 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:49 crc kubenswrapper[4831]: I1204 10:40:49.331985 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:50 crc kubenswrapper[4831]: I1204 10:40:50.228776 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bfx7n"] Dec 04 10:40:50 crc kubenswrapper[4831]: I1204 10:40:50.296300 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfvkj" event={"ID":"9437113b-ead7-4721-9bcb-8822aa9b3415","Type":"ContainerStarted","Data":"3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4"} Dec 04 10:40:51 crc kubenswrapper[4831]: I1204 10:40:51.327344 4831 generic.go:334] "Generic (PLEG): container finished" podID="9437113b-ead7-4721-9bcb-8822aa9b3415" containerID="3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4" exitCode=0 Dec 04 10:40:51 crc kubenswrapper[4831]: I1204 10:40:51.327454 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfvkj" event={"ID":"9437113b-ead7-4721-9bcb-8822aa9b3415","Type":"ContainerDied","Data":"3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4"} Dec 04 10:40:51 crc kubenswrapper[4831]: I1204 10:40:51.327991 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bfx7n" podUID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" containerName="registry-server" containerID="cri-o://72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1" gracePeriod=2 Dec 04 10:40:51 crc kubenswrapper[4831]: I1204 10:40:51.834912 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:51 crc kubenswrapper[4831]: I1204 10:40:51.975545 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-utilities\") pod \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " Dec 04 10:40:51 crc kubenswrapper[4831]: I1204 10:40:51.975767 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh5bq\" (UniqueName: \"kubernetes.io/projected/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-kube-api-access-wh5bq\") pod \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " Dec 04 10:40:51 crc kubenswrapper[4831]: I1204 10:40:51.975843 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-catalog-content\") pod \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\" (UID: \"e033ffc7-4ae8-4af5-8e16-8d44c339e76e\") " Dec 04 10:40:51 crc kubenswrapper[4831]: I1204 10:40:51.976626 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-utilities" (OuterVolumeSpecName: "utilities") pod "e033ffc7-4ae8-4af5-8e16-8d44c339e76e" (UID: "e033ffc7-4ae8-4af5-8e16-8d44c339e76e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:40:51 crc kubenswrapper[4831]: I1204 10:40:51.982622 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-kube-api-access-wh5bq" (OuterVolumeSpecName: "kube-api-access-wh5bq") pod "e033ffc7-4ae8-4af5-8e16-8d44c339e76e" (UID: "e033ffc7-4ae8-4af5-8e16-8d44c339e76e"). InnerVolumeSpecName "kube-api-access-wh5bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.078231 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.078274 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh5bq\" (UniqueName: \"kubernetes.io/projected/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-kube-api-access-wh5bq\") on node \"crc\" DevicePath \"\"" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.081365 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e033ffc7-4ae8-4af5-8e16-8d44c339e76e" (UID: "e033ffc7-4ae8-4af5-8e16-8d44c339e76e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.180584 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e033ffc7-4ae8-4af5-8e16-8d44c339e76e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.342335 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfvkj" event={"ID":"9437113b-ead7-4721-9bcb-8822aa9b3415","Type":"ContainerStarted","Data":"941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf"} Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.346380 4831 generic.go:334] "Generic (PLEG): container finished" podID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" containerID="72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1" exitCode=0 Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.346409 4831 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bfx7n" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.346424 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfx7n" event={"ID":"e033ffc7-4ae8-4af5-8e16-8d44c339e76e","Type":"ContainerDied","Data":"72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1"} Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.346695 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfx7n" event={"ID":"e033ffc7-4ae8-4af5-8e16-8d44c339e76e","Type":"ContainerDied","Data":"a3593e2d668160a7b7007b64568d63e2b7d79c53fa02586ab959338cf9263f3e"} Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.346711 4831 scope.go:117] "RemoveContainer" containerID="72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.370062 4831 scope.go:117] "RemoveContainer" containerID="cc0149d2fbe75ec270bf992b124aac4ab3f33a60e0c18b3619a6099518dbf71b" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.373853 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dfvkj" podStartSLOduration=3.483933079 podStartE2EDuration="9.37383091s" podCreationTimestamp="2025-12-04 10:40:43 +0000 UTC" firstStartedPulling="2025-12-04 10:40:46.245781279 +0000 UTC m=+1543.194956603" lastFinishedPulling="2025-12-04 10:40:52.13567912 +0000 UTC m=+1549.084854434" observedRunningTime="2025-12-04 10:40:52.362225335 +0000 UTC m=+1549.311400669" watchObservedRunningTime="2025-12-04 10:40:52.37383091 +0000 UTC m=+1549.323006224" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.387808 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bfx7n"] Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.401873 4831 scope.go:117] "RemoveContainer" 
containerID="dc89be2816cb1a1e1f2169f499cef01e025f7efb61fac2b5682b40d1af960a39" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.409523 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bfx7n"] Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.430382 4831 scope.go:117] "RemoveContainer" containerID="72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1" Dec 04 10:40:52 crc kubenswrapper[4831]: E1204 10:40:52.434954 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1\": container with ID starting with 72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1 not found: ID does not exist" containerID="72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.435029 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1"} err="failed to get container status \"72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1\": rpc error: code = NotFound desc = could not find container \"72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1\": container with ID starting with 72debe64c3d8b4c5f6f2a8bbf3644cac3ae904e995fdd1366e8b5da8eb2676f1 not found: ID does not exist" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.435082 4831 scope.go:117] "RemoveContainer" containerID="cc0149d2fbe75ec270bf992b124aac4ab3f33a60e0c18b3619a6099518dbf71b" Dec 04 10:40:52 crc kubenswrapper[4831]: E1204 10:40:52.435885 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc0149d2fbe75ec270bf992b124aac4ab3f33a60e0c18b3619a6099518dbf71b\": container with ID starting with 
cc0149d2fbe75ec270bf992b124aac4ab3f33a60e0c18b3619a6099518dbf71b not found: ID does not exist" containerID="cc0149d2fbe75ec270bf992b124aac4ab3f33a60e0c18b3619a6099518dbf71b" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.435927 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0149d2fbe75ec270bf992b124aac4ab3f33a60e0c18b3619a6099518dbf71b"} err="failed to get container status \"cc0149d2fbe75ec270bf992b124aac4ab3f33a60e0c18b3619a6099518dbf71b\": rpc error: code = NotFound desc = could not find container \"cc0149d2fbe75ec270bf992b124aac4ab3f33a60e0c18b3619a6099518dbf71b\": container with ID starting with cc0149d2fbe75ec270bf992b124aac4ab3f33a60e0c18b3619a6099518dbf71b not found: ID does not exist" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.435952 4831 scope.go:117] "RemoveContainer" containerID="dc89be2816cb1a1e1f2169f499cef01e025f7efb61fac2b5682b40d1af960a39" Dec 04 10:40:52 crc kubenswrapper[4831]: E1204 10:40:52.436402 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc89be2816cb1a1e1f2169f499cef01e025f7efb61fac2b5682b40d1af960a39\": container with ID starting with dc89be2816cb1a1e1f2169f499cef01e025f7efb61fac2b5682b40d1af960a39 not found: ID does not exist" containerID="dc89be2816cb1a1e1f2169f499cef01e025f7efb61fac2b5682b40d1af960a39" Dec 04 10:40:52 crc kubenswrapper[4831]: I1204 10:40:52.436445 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc89be2816cb1a1e1f2169f499cef01e025f7efb61fac2b5682b40d1af960a39"} err="failed to get container status \"dc89be2816cb1a1e1f2169f499cef01e025f7efb61fac2b5682b40d1af960a39\": rpc error: code = NotFound desc = could not find container \"dc89be2816cb1a1e1f2169f499cef01e025f7efb61fac2b5682b40d1af960a39\": container with ID starting with dc89be2816cb1a1e1f2169f499cef01e025f7efb61fac2b5682b40d1af960a39 not found: ID does not 
exist" Dec 04 10:40:53 crc kubenswrapper[4831]: I1204 10:40:53.292417 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" path="/var/lib/kubelet/pods/e033ffc7-4ae8-4af5-8e16-8d44c339e76e/volumes" Dec 04 10:40:53 crc kubenswrapper[4831]: I1204 10:40:53.753394 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:53 crc kubenswrapper[4831]: I1204 10:40:53.753648 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:40:53 crc kubenswrapper[4831]: I1204 10:40:53.802878 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:41:03 crc kubenswrapper[4831]: I1204 10:41:03.803430 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:41:03 crc kubenswrapper[4831]: I1204 10:41:03.854251 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dfvkj"] Dec 04 10:41:04 crc kubenswrapper[4831]: I1204 10:41:04.463603 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dfvkj" podUID="9437113b-ead7-4721-9bcb-8822aa9b3415" containerName="registry-server" containerID="cri-o://941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf" gracePeriod=2 Dec 04 10:41:04 crc kubenswrapper[4831]: I1204 10:41:04.988360 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.128557 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-utilities\") pod \"9437113b-ead7-4721-9bcb-8822aa9b3415\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.128638 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79xzk\" (UniqueName: \"kubernetes.io/projected/9437113b-ead7-4721-9bcb-8822aa9b3415-kube-api-access-79xzk\") pod \"9437113b-ead7-4721-9bcb-8822aa9b3415\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.128810 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-catalog-content\") pod \"9437113b-ead7-4721-9bcb-8822aa9b3415\" (UID: \"9437113b-ead7-4721-9bcb-8822aa9b3415\") " Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.129720 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-utilities" (OuterVolumeSpecName: "utilities") pod "9437113b-ead7-4721-9bcb-8822aa9b3415" (UID: "9437113b-ead7-4721-9bcb-8822aa9b3415"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.135056 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9437113b-ead7-4721-9bcb-8822aa9b3415-kube-api-access-79xzk" (OuterVolumeSpecName: "kube-api-access-79xzk") pod "9437113b-ead7-4721-9bcb-8822aa9b3415" (UID: "9437113b-ead7-4721-9bcb-8822aa9b3415"). InnerVolumeSpecName "kube-api-access-79xzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.181146 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9437113b-ead7-4721-9bcb-8822aa9b3415" (UID: "9437113b-ead7-4721-9bcb-8822aa9b3415"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.231300 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.231339 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79xzk\" (UniqueName: \"kubernetes.io/projected/9437113b-ead7-4721-9bcb-8822aa9b3415-kube-api-access-79xzk\") on node \"crc\" DevicePath \"\"" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.231352 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9437113b-ead7-4721-9bcb-8822aa9b3415-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.475120 4831 generic.go:334] "Generic (PLEG): container finished" podID="9437113b-ead7-4721-9bcb-8822aa9b3415" containerID="941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf" exitCode=0 Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.475163 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfvkj" event={"ID":"9437113b-ead7-4721-9bcb-8822aa9b3415","Type":"ContainerDied","Data":"941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf"} Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.475189 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-dfvkj" event={"ID":"9437113b-ead7-4721-9bcb-8822aa9b3415","Type":"ContainerDied","Data":"ce1138a9ebbe699081ed217de5748ee1380aea967b47732579324d26c65b0237"} Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.475208 4831 scope.go:117] "RemoveContainer" containerID="941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.475219 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfvkj" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.498196 4831 scope.go:117] "RemoveContainer" containerID="3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.506198 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dfvkj"] Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.517251 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dfvkj"] Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.521535 4831 scope.go:117] "RemoveContainer" containerID="9bf39f95ea7f41d9416ce47e3ca781e1c0fad79cbca2f3d86601c4fb8697e17f" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.566941 4831 scope.go:117] "RemoveContainer" containerID="941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf" Dec 04 10:41:05 crc kubenswrapper[4831]: E1204 10:41:05.567446 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf\": container with ID starting with 941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf not found: ID does not exist" containerID="941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 
10:41:05.567490 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf"} err="failed to get container status \"941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf\": rpc error: code = NotFound desc = could not find container \"941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf\": container with ID starting with 941633f51e4838c342f9830a847ccd05ca65f5fdb09dfc5b62e91610ac9eaacf not found: ID does not exist" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.567519 4831 scope.go:117] "RemoveContainer" containerID="3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4" Dec 04 10:41:05 crc kubenswrapper[4831]: E1204 10:41:05.568034 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4\": container with ID starting with 3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4 not found: ID does not exist" containerID="3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.568103 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4"} err="failed to get container status \"3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4\": rpc error: code = NotFound desc = could not find container \"3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4\": container with ID starting with 3923c3f38104ee2fe572326ccc93d7a3ffd577a135c330fcd82d69c129d903f4 not found: ID does not exist" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.568144 4831 scope.go:117] "RemoveContainer" containerID="9bf39f95ea7f41d9416ce47e3ca781e1c0fad79cbca2f3d86601c4fb8697e17f" Dec 04 10:41:05 crc 
kubenswrapper[4831]: E1204 10:41:05.568607 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf39f95ea7f41d9416ce47e3ca781e1c0fad79cbca2f3d86601c4fb8697e17f\": container with ID starting with 9bf39f95ea7f41d9416ce47e3ca781e1c0fad79cbca2f3d86601c4fb8697e17f not found: ID does not exist" containerID="9bf39f95ea7f41d9416ce47e3ca781e1c0fad79cbca2f3d86601c4fb8697e17f" Dec 04 10:41:05 crc kubenswrapper[4831]: I1204 10:41:05.568640 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf39f95ea7f41d9416ce47e3ca781e1c0fad79cbca2f3d86601c4fb8697e17f"} err="failed to get container status \"9bf39f95ea7f41d9416ce47e3ca781e1c0fad79cbca2f3d86601c4fb8697e17f\": rpc error: code = NotFound desc = could not find container \"9bf39f95ea7f41d9416ce47e3ca781e1c0fad79cbca2f3d86601c4fb8697e17f\": container with ID starting with 9bf39f95ea7f41d9416ce47e3ca781e1c0fad79cbca2f3d86601c4fb8697e17f not found: ID does not exist" Dec 04 10:41:07 crc kubenswrapper[4831]: I1204 10:41:07.287985 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9437113b-ead7-4721-9bcb-8822aa9b3415" path="/var/lib/kubelet/pods/9437113b-ead7-4721-9bcb-8822aa9b3415/volumes" Dec 04 10:41:23 crc kubenswrapper[4831]: I1204 10:41:23.836617 4831 scope.go:117] "RemoveContainer" containerID="4df4761576383142bd0ba848fb81f6cbc5e439c5f6787f7c356318ebb95a277f" Dec 04 10:42:21 crc kubenswrapper[4831]: I1204 10:42:21.971876 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:42:21 crc kubenswrapper[4831]: I1204 10:42:21.972451 4831 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:42:51 crc kubenswrapper[4831]: I1204 10:42:51.676732 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" event={"ID":"7d4bb48f-fa66-44cf-ab52-2fd190bdde16","Type":"ContainerDied","Data":"9f8d35e23c1a49ccf1c32d15df7791375b56ac864c9a7b754d8e5491ab8582f1"} Dec 04 10:42:51 crc kubenswrapper[4831]: I1204 10:42:51.676695 4831 generic.go:334] "Generic (PLEG): container finished" podID="7d4bb48f-fa66-44cf-ab52-2fd190bdde16" containerID="9f8d35e23c1a49ccf1c32d15df7791375b56ac864c9a7b754d8e5491ab8582f1" exitCode=0 Dec 04 10:42:51 crc kubenswrapper[4831]: I1204 10:42:51.971452 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:42:51 crc kubenswrapper[4831]: I1204 10:42:51.971511 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.108633 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.228347 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-bootstrap-combined-ca-bundle\") pod \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.228391 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qkfg\" (UniqueName: \"kubernetes.io/projected/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-kube-api-access-9qkfg\") pod \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.228443 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-ssh-key\") pod \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.228614 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-inventory\") pod \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\" (UID: \"7d4bb48f-fa66-44cf-ab52-2fd190bdde16\") " Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.234703 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7d4bb48f-fa66-44cf-ab52-2fd190bdde16" (UID: "7d4bb48f-fa66-44cf-ab52-2fd190bdde16"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.235045 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-kube-api-access-9qkfg" (OuterVolumeSpecName: "kube-api-access-9qkfg") pod "7d4bb48f-fa66-44cf-ab52-2fd190bdde16" (UID: "7d4bb48f-fa66-44cf-ab52-2fd190bdde16"). InnerVolumeSpecName "kube-api-access-9qkfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.259689 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-inventory" (OuterVolumeSpecName: "inventory") pod "7d4bb48f-fa66-44cf-ab52-2fd190bdde16" (UID: "7d4bb48f-fa66-44cf-ab52-2fd190bdde16"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.272364 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7d4bb48f-fa66-44cf-ab52-2fd190bdde16" (UID: "7d4bb48f-fa66-44cf-ab52-2fd190bdde16"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.331053 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.331111 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.331131 4831 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.331151 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qkfg\" (UniqueName: \"kubernetes.io/projected/7d4bb48f-fa66-44cf-ab52-2fd190bdde16-kube-api-access-9qkfg\") on node \"crc\" DevicePath \"\"" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.722425 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" event={"ID":"7d4bb48f-fa66-44cf-ab52-2fd190bdde16","Type":"ContainerDied","Data":"748b7519661f133a6b91ca3b6980f0972415ba9ae0cb16685df526c681a8c3f9"} Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.722474 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="748b7519661f133a6b91ca3b6980f0972415ba9ae0cb16685df526c681a8c3f9" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.722581 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.790310 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b"] Dec 04 10:42:53 crc kubenswrapper[4831]: E1204 10:42:53.790872 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" containerName="extract-content" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.790906 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" containerName="extract-content" Dec 04 10:42:53 crc kubenswrapper[4831]: E1204 10:42:53.790923 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" containerName="registry-server" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.790929 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" containerName="registry-server" Dec 04 10:42:53 crc kubenswrapper[4831]: E1204 10:42:53.790942 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9437113b-ead7-4721-9bcb-8822aa9b3415" containerName="extract-content" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.790948 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9437113b-ead7-4721-9bcb-8822aa9b3415" containerName="extract-content" Dec 04 10:42:53 crc kubenswrapper[4831]: E1204 10:42:53.790976 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" containerName="extract-utilities" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.790982 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" containerName="extract-utilities" Dec 04 10:42:53 crc kubenswrapper[4831]: E1204 10:42:53.790994 4831 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7d4bb48f-fa66-44cf-ab52-2fd190bdde16" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.791001 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4bb48f-fa66-44cf-ab52-2fd190bdde16" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:42:53 crc kubenswrapper[4831]: E1204 10:42:53.791011 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9437113b-ead7-4721-9bcb-8822aa9b3415" containerName="registry-server" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.791017 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9437113b-ead7-4721-9bcb-8822aa9b3415" containerName="registry-server" Dec 04 10:42:53 crc kubenswrapper[4831]: E1204 10:42:53.791032 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9437113b-ead7-4721-9bcb-8822aa9b3415" containerName="extract-utilities" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.791038 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9437113b-ead7-4721-9bcb-8822aa9b3415" containerName="extract-utilities" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.791219 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4bb48f-fa66-44cf-ab52-2fd190bdde16" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.791234 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e033ffc7-4ae8-4af5-8e16-8d44c339e76e" containerName="registry-server" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.791244 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9437113b-ead7-4721-9bcb-8822aa9b3415" containerName="registry-server" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.792071 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.793843 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.795098 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.795834 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.795947 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.802729 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b"] Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.942934 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b924b\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.943429 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b924b\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:42:53 crc kubenswrapper[4831]: I1204 10:42:53.943755 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2dkz\" (UniqueName: \"kubernetes.io/projected/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-kube-api-access-x2dkz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b924b\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:42:54 crc kubenswrapper[4831]: I1204 10:42:54.045530 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b924b\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:42:54 crc kubenswrapper[4831]: I1204 10:42:54.045973 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b924b\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:42:54 crc kubenswrapper[4831]: I1204 10:42:54.046068 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dkz\" (UniqueName: \"kubernetes.io/projected/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-kube-api-access-x2dkz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b924b\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:42:54 crc kubenswrapper[4831]: I1204 10:42:54.052487 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-b924b\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:42:54 crc kubenswrapper[4831]: I1204 10:42:54.054260 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b924b\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:42:54 crc kubenswrapper[4831]: I1204 10:42:54.074592 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2dkz\" (UniqueName: \"kubernetes.io/projected/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-kube-api-access-x2dkz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b924b\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:42:54 crc kubenswrapper[4831]: I1204 10:42:54.120170 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:42:54 crc kubenswrapper[4831]: I1204 10:42:54.692627 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b"] Dec 04 10:42:54 crc kubenswrapper[4831]: W1204 10:42:54.695737 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb1963f3_7a7f_40b9_a9b2_b74f220bebb2.slice/crio-ca9a1aad112412314f01431498e802bad9d39a1f0861f2b9c1751ca70f0d237f WatchSource:0}: Error finding container ca9a1aad112412314f01431498e802bad9d39a1f0861f2b9c1751ca70f0d237f: Status 404 returned error can't find the container with id ca9a1aad112412314f01431498e802bad9d39a1f0861f2b9c1751ca70f0d237f Dec 04 10:42:54 crc kubenswrapper[4831]: I1204 10:42:54.737790 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" event={"ID":"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2","Type":"ContainerStarted","Data":"ca9a1aad112412314f01431498e802bad9d39a1f0861f2b9c1751ca70f0d237f"} Dec 04 10:42:55 crc kubenswrapper[4831]: I1204 10:42:55.750889 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" event={"ID":"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2","Type":"ContainerStarted","Data":"64aa75e697b5e859bc84948dd20851ad0935633fff6bb507b2707b71f374c652"} Dec 04 10:42:55 crc kubenswrapper[4831]: I1204 10:42:55.775328 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" podStartSLOduration=2.365590229 podStartE2EDuration="2.775303669s" podCreationTimestamp="2025-12-04 10:42:53 +0000 UTC" firstStartedPulling="2025-12-04 10:42:54.699017458 +0000 UTC m=+1671.648192772" lastFinishedPulling="2025-12-04 10:42:55.108730898 +0000 UTC 
m=+1672.057906212" observedRunningTime="2025-12-04 10:42:55.767212737 +0000 UTC m=+1672.716388071" watchObservedRunningTime="2025-12-04 10:42:55.775303669 +0000 UTC m=+1672.724478993" Dec 04 10:43:12 crc kubenswrapper[4831]: I1204 10:43:12.043741 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-68cqf"] Dec 04 10:43:12 crc kubenswrapper[4831]: I1204 10:43:12.056009 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-hwz95"] Dec 04 10:43:12 crc kubenswrapper[4831]: I1204 10:43:12.064653 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hwz95"] Dec 04 10:43:12 crc kubenswrapper[4831]: I1204 10:43:12.073278 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-68cqf"] Dec 04 10:43:13 crc kubenswrapper[4831]: I1204 10:43:13.292645 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9" path="/var/lib/kubelet/pods/a0eba6b8-a3b0-4fb3-997b-6ce23e035ec9/volumes" Dec 04 10:43:13 crc kubenswrapper[4831]: I1204 10:43:13.294068 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8149a56-f5ed-43f5-98a4-8d7324feadce" path="/var/lib/kubelet/pods/f8149a56-f5ed-43f5-98a4-8d7324feadce/volumes" Dec 04 10:43:16 crc kubenswrapper[4831]: I1204 10:43:16.029975 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-45skt"] Dec 04 10:43:16 crc kubenswrapper[4831]: I1204 10:43:16.043627 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-45skt"] Dec 04 10:43:17 crc kubenswrapper[4831]: I1204 10:43:17.287621 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b95fde6f-f78a-4a46-ab00-5f817de61b4e" path="/var/lib/kubelet/pods/b95fde6f-f78a-4a46-ab00-5f817de61b4e/volumes" Dec 04 10:43:21 crc kubenswrapper[4831]: I1204 10:43:21.971591 4831 patch_prober.go:28] interesting 
pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:43:21 crc kubenswrapper[4831]: I1204 10:43:21.972260 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:43:21 crc kubenswrapper[4831]: I1204 10:43:21.972316 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:43:21 crc kubenswrapper[4831]: I1204 10:43:21.973158 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:43:21 crc kubenswrapper[4831]: I1204 10:43:21.973246 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" gracePeriod=600 Dec 04 10:43:22 crc kubenswrapper[4831]: E1204 10:43:22.548836 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.008548 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" exitCode=0 Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.008598 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58"} Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.008671 4831 scope.go:117] "RemoveContainer" containerID="61f7279f438426e6b31be64e9ccb62c729f466d054e9b0e0a804c882066b625e" Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.009340 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:43:23 crc kubenswrapper[4831]: E1204 10:43:23.009644 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.049699 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-e63e-account-create-rx9tx"] Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.064720 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-cd08-account-create-6kj6f"] Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.086299 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-e63e-account-create-rx9tx"] Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.095459 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-cd08-account-create-6kj6f"] Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.288253 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9433b0a1-4aba-4dea-9245-00e6e0ea65b3" path="/var/lib/kubelet/pods/9433b0a1-4aba-4dea-9245-00e6e0ea65b3/volumes" Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.289547 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d25f503b-ff4c-4e83-9ac4-8342e6b525a5" path="/var/lib/kubelet/pods/d25f503b-ff4c-4e83-9ac4-8342e6b525a5/volumes" Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.940957 4831 scope.go:117] "RemoveContainer" containerID="707c8a641959817a87955edd4ebfe1341fbcac68ce3e10572059efeb384fd81a" Dec 04 10:43:23 crc kubenswrapper[4831]: I1204 10:43:23.968993 4831 scope.go:117] "RemoveContainer" containerID="0b0d322bf8c7d8d87f97d5c85465dee8ff31c878c6974ec296a81d9912a4d80f" Dec 04 10:43:24 crc kubenswrapper[4831]: I1204 10:43:24.014183 4831 scope.go:117] "RemoveContainer" containerID="1528f69fc1b30778b5461d697506bcae221f968b424cbcea9da7e200d091e0a2" Dec 04 10:43:24 crc kubenswrapper[4831]: I1204 10:43:24.076923 4831 scope.go:117] "RemoveContainer" containerID="3f11b7e9c243b75f8a0ef0d67eb7921601bbe2240fbbee0ed72bda35da870121" Dec 04 10:43:24 crc kubenswrapper[4831]: I1204 10:43:24.158458 4831 scope.go:117] "RemoveContainer" containerID="55abb855d2c74bef7973a0805864c932ae617a0fee27b66627a41a0e4900ac2b" Dec 04 10:43:35 crc kubenswrapper[4831]: I1204 10:43:35.027540 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-4a99-account-create-22f2m"] Dec 04 10:43:35 crc 
kubenswrapper[4831]: I1204 10:43:35.038211 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-4a99-account-create-22f2m"] Dec 04 10:43:35 crc kubenswrapper[4831]: I1204 10:43:35.286890 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130910ed-23cb-4473-b0cb-e59907852513" path="/var/lib/kubelet/pods/130910ed-23cb-4473-b0cb-e59907852513/volumes" Dec 04 10:43:36 crc kubenswrapper[4831]: I1204 10:43:36.277760 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:43:36 crc kubenswrapper[4831]: E1204 10:43:36.278588 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:43:44 crc kubenswrapper[4831]: I1204 10:43:44.045626 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xhvnt"] Dec 04 10:43:44 crc kubenswrapper[4831]: I1204 10:43:44.057251 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6mcl2"] Dec 04 10:43:44 crc kubenswrapper[4831]: I1204 10:43:44.066323 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xhvnt"] Dec 04 10:43:44 crc kubenswrapper[4831]: I1204 10:43:44.075173 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6mcl2"] Dec 04 10:43:45 crc kubenswrapper[4831]: I1204 10:43:45.292464 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561a4728-58d4-40dc-ad9b-604394f208d6" path="/var/lib/kubelet/pods/561a4728-58d4-40dc-ad9b-604394f208d6/volumes" Dec 04 10:43:45 crc kubenswrapper[4831]: 
I1204 10:43:45.293567 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc17360e-f22d-4420-8de8-5a95abc8f54c" path="/var/lib/kubelet/pods/dc17360e-f22d-4420-8de8-5a95abc8f54c/volumes" Dec 04 10:43:46 crc kubenswrapper[4831]: I1204 10:43:46.028465 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pl6jb"] Dec 04 10:43:46 crc kubenswrapper[4831]: I1204 10:43:46.039952 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-m6n2t"] Dec 04 10:43:46 crc kubenswrapper[4831]: I1204 10:43:46.048777 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pl6jb"] Dec 04 10:43:46 crc kubenswrapper[4831]: I1204 10:43:46.057838 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-m6n2t"] Dec 04 10:43:47 crc kubenswrapper[4831]: I1204 10:43:47.288594 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dba9a395-7b7d-4a63-be80-61caea16396b" path="/var/lib/kubelet/pods/dba9a395-7b7d-4a63-be80-61caea16396b/volumes" Dec 04 10:43:47 crc kubenswrapper[4831]: I1204 10:43:47.290253 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f36d8a75-9447-4921-8f46-f3fdb08160d8" path="/var/lib/kubelet/pods/f36d8a75-9447-4921-8f46-f3fdb08160d8/volumes" Dec 04 10:43:50 crc kubenswrapper[4831]: I1204 10:43:50.276958 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:43:50 crc kubenswrapper[4831]: E1204 10:43:50.277754 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" 
podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:43:51 crc kubenswrapper[4831]: I1204 10:43:51.030849 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-lm2h7"] Dec 04 10:43:51 crc kubenswrapper[4831]: I1204 10:43:51.040520 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-lm2h7"] Dec 04 10:43:51 crc kubenswrapper[4831]: I1204 10:43:51.286646 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c771ae2e-18c8-42c4-a789-7b55fea8e605" path="/var/lib/kubelet/pods/c771ae2e-18c8-42c4-a789-7b55fea8e605/volumes" Dec 04 10:43:53 crc kubenswrapper[4831]: I1204 10:43:53.030359 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b886-account-create-r4hm2"] Dec 04 10:43:53 crc kubenswrapper[4831]: I1204 10:43:53.040880 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b886-account-create-r4hm2"] Dec 04 10:43:53 crc kubenswrapper[4831]: I1204 10:43:53.287219 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a201ba-1bd6-41a4-a891-a82a2e017da1" path="/var/lib/kubelet/pods/91a201ba-1bd6-41a4-a891-a82a2e017da1/volumes" Dec 04 10:44:04 crc kubenswrapper[4831]: I1204 10:44:04.276422 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:44:04 crc kubenswrapper[4831]: E1204 10:44:04.278584 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:44:10 crc kubenswrapper[4831]: I1204 10:44:10.042060 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-ba03-account-create-gvcf6"] Dec 04 10:44:10 crc kubenswrapper[4831]: I1204 10:44:10.052373 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-hwbtz"] Dec 04 10:44:10 crc kubenswrapper[4831]: I1204 10:44:10.062940 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ba03-account-create-gvcf6"] Dec 04 10:44:10 crc kubenswrapper[4831]: I1204 10:44:10.070439 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-hwbtz"] Dec 04 10:44:11 crc kubenswrapper[4831]: I1204 10:44:11.288520 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d40946a3-b49e-4cb0-b113-18711def8c0e" path="/var/lib/kubelet/pods/d40946a3-b49e-4cb0-b113-18711def8c0e/volumes" Dec 04 10:44:11 crc kubenswrapper[4831]: I1204 10:44:11.289411 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a" path="/var/lib/kubelet/pods/ee687d9f-0cd1-40f3-a5e5-e8891e8d0d3a/volumes" Dec 04 10:44:12 crc kubenswrapper[4831]: I1204 10:44:12.035524 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-edf8-account-create-w79xp"] Dec 04 10:44:12 crc kubenswrapper[4831]: I1204 10:44:12.050218 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-edf8-account-create-w79xp"] Dec 04 10:44:13 crc kubenswrapper[4831]: I1204 10:44:13.286968 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="664dafb4-41fc-4bfc-8355-32ae4ef51867" path="/var/lib/kubelet/pods/664dafb4-41fc-4bfc-8355-32ae4ef51867/volumes" Dec 04 10:44:16 crc kubenswrapper[4831]: I1204 10:44:16.277595 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:44:16 crc kubenswrapper[4831]: E1204 10:44:16.278514 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:44:18 crc kubenswrapper[4831]: I1204 10:44:18.027680 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-377a-account-create-c9vwp"] Dec 04 10:44:18 crc kubenswrapper[4831]: I1204 10:44:18.036473 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-377a-account-create-c9vwp"] Dec 04 10:44:19 crc kubenswrapper[4831]: I1204 10:44:19.037343 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s9bgg"] Dec 04 10:44:19 crc kubenswrapper[4831]: I1204 10:44:19.052848 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s9bgg"] Dec 04 10:44:19 crc kubenswrapper[4831]: I1204 10:44:19.289040 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5" path="/var/lib/kubelet/pods/ce1fbcf4-dcad-4ddc-8456-cb85d7f29df5/volumes" Dec 04 10:44:19 crc kubenswrapper[4831]: I1204 10:44:19.289621 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f" path="/var/lib/kubelet/pods/f67e7c2e-46d3-4e53-97fe-1b1a7d2e7d9f/volumes" Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.346376 4831 scope.go:117] "RemoveContainer" containerID="7587716e8a8723f8f0a01c1bea6561f868845004d917c80310ea5759e4e620b3" Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.384012 4831 scope.go:117] "RemoveContainer" containerID="88e3709a20824fbda63a27d39977898e63725c6f9ce174c6256c7a8bde9ef169" Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.440993 4831 scope.go:117] "RemoveContainer" containerID="292bb796952f79a1b29382424747ba3fc7838fec6389269074b00bfd131670d1" 
Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.527201 4831 scope.go:117] "RemoveContainer" containerID="e77834aaccfef7207ad6ec5de99f29061ec47a32f6211f8ba60f1ebd9a582255" Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.562217 4831 scope.go:117] "RemoveContainer" containerID="32a6413864550a3235d47d439de607080de0a4fcd14b12c63c541d7e0e15d3fe" Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.608519 4831 scope.go:117] "RemoveContainer" containerID="935821672bc4757d8ed34b09d63c80b995dd9a0e6dbdb2c7c6d87b0183e4b271" Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.655426 4831 scope.go:117] "RemoveContainer" containerID="1029616cc98d2367ff3a607ed08022b9f67c4b1f011c7de6880df86463eb3310" Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.682900 4831 scope.go:117] "RemoveContainer" containerID="3b25fa24b36a0dd425ffaf8fe0b8181de2a3297787a29654799f0fb4a0cf8896" Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.702483 4831 scope.go:117] "RemoveContainer" containerID="d6d1eef002d83ca29576f21cf66897b7fdd4a93cd18544403c6a0a5444b5c3ff" Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.727697 4831 scope.go:117] "RemoveContainer" containerID="3a1d4237f22b0570e9b56ea8d266009644ae7c742001de722ea30ae565cae9d1" Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.750874 4831 scope.go:117] "RemoveContainer" containerID="ad385629ce59d58b186d0c360c1876ad8d7a01e96e773c4c52a5a84490d884ad" Dec 04 10:44:24 crc kubenswrapper[4831]: I1204 10:44:24.779819 4831 scope.go:117] "RemoveContainer" containerID="00de293c46492e8856a60277fad082eca3709b73ad4afe3288c8baff079841aa" Dec 04 10:44:28 crc kubenswrapper[4831]: I1204 10:44:28.277242 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:44:28 crc kubenswrapper[4831]: E1204 10:44:28.278170 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:44:39 crc kubenswrapper[4831]: I1204 10:44:39.045840 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-7svk9"] Dec 04 10:44:39 crc kubenswrapper[4831]: I1204 10:44:39.058712 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-7svk9"] Dec 04 10:44:39 crc kubenswrapper[4831]: I1204 10:44:39.288838 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79b812b-5862-43d0-a6d5-5c6ec3a63e51" path="/var/lib/kubelet/pods/a79b812b-5862-43d0-a6d5-5c6ec3a63e51/volumes" Dec 04 10:44:42 crc kubenswrapper[4831]: I1204 10:44:42.276303 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:44:42 crc kubenswrapper[4831]: E1204 10:44:42.276972 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:44:48 crc kubenswrapper[4831]: I1204 10:44:48.027281 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-9hwx9"] Dec 04 10:44:48 crc kubenswrapper[4831]: I1204 10:44:48.043938 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-9hwx9"] Dec 04 10:44:49 crc kubenswrapper[4831]: I1204 10:44:49.287599 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e" path="/var/lib/kubelet/pods/ec9a65ef-c4b5-4c6f-b4c1-91c51e7c070e/volumes" Dec 04 10:44:54 crc kubenswrapper[4831]: I1204 10:44:54.032986 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-q6jc5"] Dec 04 10:44:54 crc kubenswrapper[4831]: I1204 10:44:54.043095 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-q6jc5"] Dec 04 10:44:55 crc kubenswrapper[4831]: I1204 10:44:55.277085 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:44:55 crc kubenswrapper[4831]: E1204 10:44:55.277912 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:44:55 crc kubenswrapper[4831]: I1204 10:44:55.288343 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41899055-8db6-4cdb-a9da-2bbb143b9f3f" path="/var/lib/kubelet/pods/41899055-8db6-4cdb-a9da-2bbb143b9f3f/volumes" Dec 04 10:44:58 crc kubenswrapper[4831]: I1204 10:44:58.038610 4831 generic.go:334] "Generic (PLEG): container finished" podID="bb1963f3-7a7f-40b9-a9b2-b74f220bebb2" containerID="64aa75e697b5e859bc84948dd20851ad0935633fff6bb507b2707b71f374c652" exitCode=0 Dec 04 10:44:58 crc kubenswrapper[4831]: I1204 10:44:58.038708 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" event={"ID":"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2","Type":"ContainerDied","Data":"64aa75e697b5e859bc84948dd20851ad0935633fff6bb507b2707b71f374c652"} Dec 04 10:44:59 crc kubenswrapper[4831]: I1204 
10:44:59.485569 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:44:59 crc kubenswrapper[4831]: I1204 10:44:59.541740 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2dkz\" (UniqueName: \"kubernetes.io/projected/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-kube-api-access-x2dkz\") pod \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " Dec 04 10:44:59 crc kubenswrapper[4831]: I1204 10:44:59.541795 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-ssh-key\") pod \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " Dec 04 10:44:59 crc kubenswrapper[4831]: I1204 10:44:59.541991 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-inventory\") pod \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\" (UID: \"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2\") " Dec 04 10:44:59 crc kubenswrapper[4831]: I1204 10:44:59.548196 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-kube-api-access-x2dkz" (OuterVolumeSpecName: "kube-api-access-x2dkz") pod "bb1963f3-7a7f-40b9-a9b2-b74f220bebb2" (UID: "bb1963f3-7a7f-40b9-a9b2-b74f220bebb2"). InnerVolumeSpecName "kube-api-access-x2dkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:44:59 crc kubenswrapper[4831]: I1204 10:44:59.573639 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-inventory" (OuterVolumeSpecName: "inventory") pod "bb1963f3-7a7f-40b9-a9b2-b74f220bebb2" (UID: "bb1963f3-7a7f-40b9-a9b2-b74f220bebb2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:44:59 crc kubenswrapper[4831]: I1204 10:44:59.574002 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bb1963f3-7a7f-40b9-a9b2-b74f220bebb2" (UID: "bb1963f3-7a7f-40b9-a9b2-b74f220bebb2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:44:59 crc kubenswrapper[4831]: I1204 10:44:59.644285 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2dkz\" (UniqueName: \"kubernetes.io/projected/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-kube-api-access-x2dkz\") on node \"crc\" DevicePath \"\"" Dec 04 10:44:59 crc kubenswrapper[4831]: I1204 10:44:59.644328 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:44:59 crc kubenswrapper[4831]: I1204 10:44:59.644341 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb1963f3-7a7f-40b9-a9b2-b74f220bebb2-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.062204 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" 
event={"ID":"bb1963f3-7a7f-40b9-a9b2-b74f220bebb2","Type":"ContainerDied","Data":"ca9a1aad112412314f01431498e802bad9d39a1f0861f2b9c1751ca70f0d237f"} Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.062292 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b924b" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.063245 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca9a1aad112412314f01431498e802bad9d39a1f0861f2b9c1751ca70f0d237f" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.144118 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78"] Dec 04 10:45:00 crc kubenswrapper[4831]: E1204 10:45:00.144656 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1963f3-7a7f-40b9-a9b2-b74f220bebb2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.144702 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1963f3-7a7f-40b9-a9b2-b74f220bebb2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.144998 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb1963f3-7a7f-40b9-a9b2-b74f220bebb2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.145927 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.148351 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.150419 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.150669 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.150956 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.167835 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78"] Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.239203 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn"] Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.240613 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.242739 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.243135 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.258016 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn"] Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.258327 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lhcc\" (UniqueName: \"kubernetes.io/projected/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-kube-api-access-7lhcc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdn78\" (UID: \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.258498 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdn78\" (UID: \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.258535 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdn78\" (UID: 
\"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.360313 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phcg4\" (UniqueName: \"kubernetes.io/projected/f2b62802-882f-4b75-95ca-736ffafb4d63-kube-api-access-phcg4\") pod \"collect-profiles-29414085-vrmpn\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.360418 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2b62802-882f-4b75-95ca-736ffafb4d63-config-volume\") pod \"collect-profiles-29414085-vrmpn\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.360471 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2b62802-882f-4b75-95ca-736ffafb4d63-secret-volume\") pod \"collect-profiles-29414085-vrmpn\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.360495 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdn78\" (UID: \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.360518 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdn78\" (UID: \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.360624 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lhcc\" (UniqueName: \"kubernetes.io/projected/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-kube-api-access-7lhcc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdn78\" (UID: \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.367811 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdn78\" (UID: \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.367811 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdn78\" (UID: \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.380696 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lhcc\" (UniqueName: \"kubernetes.io/projected/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-kube-api-access-7lhcc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdn78\" 
(UID: \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.462219 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phcg4\" (UniqueName: \"kubernetes.io/projected/f2b62802-882f-4b75-95ca-736ffafb4d63-kube-api-access-phcg4\") pod \"collect-profiles-29414085-vrmpn\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.462317 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2b62802-882f-4b75-95ca-736ffafb4d63-config-volume\") pod \"collect-profiles-29414085-vrmpn\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.462361 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2b62802-882f-4b75-95ca-736ffafb4d63-secret-volume\") pod \"collect-profiles-29414085-vrmpn\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.463422 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2b62802-882f-4b75-95ca-736ffafb4d63-config-volume\") pod \"collect-profiles-29414085-vrmpn\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.466973 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f2b62802-882f-4b75-95ca-736ffafb4d63-secret-volume\") pod \"collect-profiles-29414085-vrmpn\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.467455 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.480803 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phcg4\" (UniqueName: \"kubernetes.io/projected/f2b62802-882f-4b75-95ca-736ffafb4d63-kube-api-access-phcg4\") pod \"collect-profiles-29414085-vrmpn\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.612095 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:00 crc kubenswrapper[4831]: I1204 10:45:00.998728 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78"] Dec 04 10:45:01 crc kubenswrapper[4831]: I1204 10:45:01.068619 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn"] Dec 04 10:45:01 crc kubenswrapper[4831]: I1204 10:45:01.076305 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" event={"ID":"4acb6010-6e0f-45c8-b768-1bd3a9a090c8","Type":"ContainerStarted","Data":"68a888bca141032878d9fcae1adc86140b4e32d6196e632fe53fb9464c0e3360"} Dec 04 10:45:02 crc kubenswrapper[4831]: I1204 10:45:02.089452 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" event={"ID":"4acb6010-6e0f-45c8-b768-1bd3a9a090c8","Type":"ContainerStarted","Data":"92de8076f198470586c17cf59098a3abe87d4723998eabcd87989abacb699a4a"} Dec 04 10:45:02 crc kubenswrapper[4831]: I1204 10:45:02.093413 4831 generic.go:334] "Generic (PLEG): container finished" podID="f2b62802-882f-4b75-95ca-736ffafb4d63" containerID="06412bf6396e3da51e6721e086e6137cda6d472fcd88c2b07fa29fd2f28c8fd2" exitCode=0 Dec 04 10:45:02 crc kubenswrapper[4831]: I1204 10:45:02.093459 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" event={"ID":"f2b62802-882f-4b75-95ca-736ffafb4d63","Type":"ContainerDied","Data":"06412bf6396e3da51e6721e086e6137cda6d472fcd88c2b07fa29fd2f28c8fd2"} Dec 04 10:45:02 crc kubenswrapper[4831]: I1204 10:45:02.093503 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" event={"ID":"f2b62802-882f-4b75-95ca-736ffafb4d63","Type":"ContainerStarted","Data":"258c26946c91eb1768ad895cafadb70f9ac08d6cef0c6341a53f6c35696710f4"} Dec 04 10:45:02 crc kubenswrapper[4831]: I1204 10:45:02.111516 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" podStartSLOduration=1.7194286760000002 podStartE2EDuration="2.111491169s" podCreationTimestamp="2025-12-04 10:45:00 +0000 UTC" firstStartedPulling="2025-12-04 10:45:01.00206129 +0000 UTC m=+1797.951236604" lastFinishedPulling="2025-12-04 10:45:01.394123783 +0000 UTC m=+1798.343299097" observedRunningTime="2025-12-04 10:45:02.109838385 +0000 UTC m=+1799.059013719" watchObservedRunningTime="2025-12-04 10:45:02.111491169 +0000 UTC m=+1799.060666483" Dec 04 10:45:03 crc kubenswrapper[4831]: E1204 10:45:03.279075 4831 info.go:109] Failed to get network devices: open /sys/class/net/258c26946c91eb1/address: no such 
file or directory Dec 04 10:45:03 crc kubenswrapper[4831]: I1204 10:45:03.485759 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:03 crc kubenswrapper[4831]: I1204 10:45:03.525407 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phcg4\" (UniqueName: \"kubernetes.io/projected/f2b62802-882f-4b75-95ca-736ffafb4d63-kube-api-access-phcg4\") pod \"f2b62802-882f-4b75-95ca-736ffafb4d63\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " Dec 04 10:45:03 crc kubenswrapper[4831]: I1204 10:45:03.525687 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2b62802-882f-4b75-95ca-736ffafb4d63-config-volume\") pod \"f2b62802-882f-4b75-95ca-736ffafb4d63\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " Dec 04 10:45:03 crc kubenswrapper[4831]: I1204 10:45:03.525797 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2b62802-882f-4b75-95ca-736ffafb4d63-secret-volume\") pod \"f2b62802-882f-4b75-95ca-736ffafb4d63\" (UID: \"f2b62802-882f-4b75-95ca-736ffafb4d63\") " Dec 04 10:45:03 crc kubenswrapper[4831]: I1204 10:45:03.526328 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b62802-882f-4b75-95ca-736ffafb4d63-config-volume" (OuterVolumeSpecName: "config-volume") pod "f2b62802-882f-4b75-95ca-736ffafb4d63" (UID: "f2b62802-882f-4b75-95ca-736ffafb4d63"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:45:03 crc kubenswrapper[4831]: I1204 10:45:03.531764 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2b62802-882f-4b75-95ca-736ffafb4d63-kube-api-access-phcg4" (OuterVolumeSpecName: "kube-api-access-phcg4") pod "f2b62802-882f-4b75-95ca-736ffafb4d63" (UID: "f2b62802-882f-4b75-95ca-736ffafb4d63"). InnerVolumeSpecName "kube-api-access-phcg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:45:03 crc kubenswrapper[4831]: I1204 10:45:03.532251 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b62802-882f-4b75-95ca-736ffafb4d63-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f2b62802-882f-4b75-95ca-736ffafb4d63" (UID: "f2b62802-882f-4b75-95ca-736ffafb4d63"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:45:03 crc kubenswrapper[4831]: I1204 10:45:03.628229 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2b62802-882f-4b75-95ca-736ffafb4d63-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:03 crc kubenswrapper[4831]: I1204 10:45:03.628507 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2b62802-882f-4b75-95ca-736ffafb4d63-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:03 crc kubenswrapper[4831]: I1204 10:45:03.628566 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phcg4\" (UniqueName: \"kubernetes.io/projected/f2b62802-882f-4b75-95ca-736ffafb4d63-kube-api-access-phcg4\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:04 crc kubenswrapper[4831]: I1204 10:45:04.114868 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" 
event={"ID":"f2b62802-882f-4b75-95ca-736ffafb4d63","Type":"ContainerDied","Data":"258c26946c91eb1768ad895cafadb70f9ac08d6cef0c6341a53f6c35696710f4"} Dec 04 10:45:04 crc kubenswrapper[4831]: I1204 10:45:04.115391 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="258c26946c91eb1768ad895cafadb70f9ac08d6cef0c6341a53f6c35696710f4" Dec 04 10:45:04 crc kubenswrapper[4831]: I1204 10:45:04.115345 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn" Dec 04 10:45:08 crc kubenswrapper[4831]: I1204 10:45:08.277340 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:45:08 crc kubenswrapper[4831]: E1204 10:45:08.278190 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:45:16 crc kubenswrapper[4831]: I1204 10:45:16.062090 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-dmcxs"] Dec 04 10:45:16 crc kubenswrapper[4831]: I1204 10:45:16.070977 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-dmcxs"] Dec 04 10:45:17 crc kubenswrapper[4831]: I1204 10:45:17.286944 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94129f00-4043-4552-9724-feef1585cd20" path="/var/lib/kubelet/pods/94129f00-4043-4552-9724-feef1585cd20/volumes" Dec 04 10:45:18 crc kubenswrapper[4831]: I1204 10:45:18.030645 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-g5bvk"] Dec 04 10:45:18 crc 
kubenswrapper[4831]: I1204 10:45:18.040313 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-g5bvk"] Dec 04 10:45:19 crc kubenswrapper[4831]: I1204 10:45:19.031120 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-vthqg"] Dec 04 10:45:19 crc kubenswrapper[4831]: I1204 10:45:19.040132 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-vthqg"] Dec 04 10:45:19 crc kubenswrapper[4831]: I1204 10:45:19.294026 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f48aa7-65d1-41ce-bc0d-4973db8b7abe" path="/var/lib/kubelet/pods/18f48aa7-65d1-41ce-bc0d-4973db8b7abe/volumes" Dec 04 10:45:19 crc kubenswrapper[4831]: I1204 10:45:19.294731 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="292354b3-46c7-4c76-b593-dda39380e797" path="/var/lib/kubelet/pods/292354b3-46c7-4c76-b593-dda39380e797/volumes" Dec 04 10:45:20 crc kubenswrapper[4831]: I1204 10:45:20.044493 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-x7lqv"] Dec 04 10:45:20 crc kubenswrapper[4831]: I1204 10:45:20.070413 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-x7lqv"] Dec 04 10:45:21 crc kubenswrapper[4831]: I1204 10:45:21.031111 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hd6qr"] Dec 04 10:45:21 crc kubenswrapper[4831]: I1204 10:45:21.043857 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hd6qr"] Dec 04 10:45:21 crc kubenswrapper[4831]: I1204 10:45:21.288341 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2" path="/var/lib/kubelet/pods/9066e1c7-0172-4ff8-b3ef-b4f5e316a4d2/volumes" Dec 04 10:45:21 crc kubenswrapper[4831]: I1204 10:45:21.288920 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f292b8ac-6250-4a8a-b73e-75c6aeebe9d5" path="/var/lib/kubelet/pods/f292b8ac-6250-4a8a-b73e-75c6aeebe9d5/volumes" Dec 04 10:45:22 crc kubenswrapper[4831]: I1204 10:45:22.276493 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:45:22 crc kubenswrapper[4831]: E1204 10:45:22.276887 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:45:25 crc kubenswrapper[4831]: I1204 10:45:25.052501 4831 scope.go:117] "RemoveContainer" containerID="f926e57e3fd75cc46053dfb6e24b1843bd59fb8625828329f64b893c77a761da" Dec 04 10:45:25 crc kubenswrapper[4831]: I1204 10:45:25.095492 4831 scope.go:117] "RemoveContainer" containerID="374ed1560d678f11ec4bed60c79dbd54071ebb83505a5e220a5dee9f726db39f" Dec 04 10:45:25 crc kubenswrapper[4831]: I1204 10:45:25.148561 4831 scope.go:117] "RemoveContainer" containerID="f4613a44c044b4534a4ec68c77b6c0ccafde0d0eef4915d6068731e902b02e5f" Dec 04 10:45:25 crc kubenswrapper[4831]: I1204 10:45:25.189377 4831 scope.go:117] "RemoveContainer" containerID="c41373f79e32aa972af49c9686a345355f2b1507c776d100bb2e83e3fadd9eed" Dec 04 10:45:25 crc kubenswrapper[4831]: I1204 10:45:25.234593 4831 scope.go:117] "RemoveContainer" containerID="b1f56fd39c8ea9d813961a992c388c6af7b8f455249d842576221b585ac0e10e" Dec 04 10:45:25 crc kubenswrapper[4831]: I1204 10:45:25.282695 4831 scope.go:117] "RemoveContainer" containerID="89f75b3af9abc32d45d9c268e7e64c96787f53d6d6d7e348929c9398347edde5" Dec 04 10:45:25 crc kubenswrapper[4831]: I1204 10:45:25.318081 4831 scope.go:117] "RemoveContainer" 
containerID="3955d53461eddec4027820d5075fab1744fdfab5ddf7f0c3d04e497e833aa4b9" Dec 04 10:45:25 crc kubenswrapper[4831]: I1204 10:45:25.352828 4831 scope.go:117] "RemoveContainer" containerID="0932a40e56e000a340375dbe475e2b4e990f48ccb1c48929447bb5a112edb0b3" Dec 04 10:45:35 crc kubenswrapper[4831]: I1204 10:45:35.279940 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:45:35 crc kubenswrapper[4831]: E1204 10:45:35.280943 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:45:39 crc kubenswrapper[4831]: I1204 10:45:39.053073 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-63ed-account-create-lqdgk"] Dec 04 10:45:39 crc kubenswrapper[4831]: I1204 10:45:39.066057 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2dcb-account-create-xjrzk"] Dec 04 10:45:39 crc kubenswrapper[4831]: I1204 10:45:39.074456 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-b1d5-account-create-lq99p"] Dec 04 10:45:39 crc kubenswrapper[4831]: I1204 10:45:39.082823 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-63ed-account-create-lqdgk"] Dec 04 10:45:39 crc kubenswrapper[4831]: I1204 10:45:39.090235 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2dcb-account-create-xjrzk"] Dec 04 10:45:39 crc kubenswrapper[4831]: I1204 10:45:39.097610 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-b1d5-account-create-lq99p"] Dec 04 10:45:39 crc kubenswrapper[4831]: 
I1204 10:45:39.286865 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a7e487-a7d3-491f-bc96-e7e2ff378a2c" path="/var/lib/kubelet/pods/65a7e487-a7d3-491f-bc96-e7e2ff378a2c/volumes" Dec 04 10:45:39 crc kubenswrapper[4831]: I1204 10:45:39.287413 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf3269a-7a62-450b-b6e3-5b18451dd26f" path="/var/lib/kubelet/pods/bcf3269a-7a62-450b-b6e3-5b18451dd26f/volumes" Dec 04 10:45:39 crc kubenswrapper[4831]: I1204 10:45:39.287926 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0cfead2-fbc2-4c82-a779-c1419a2bdd13" path="/var/lib/kubelet/pods/d0cfead2-fbc2-4c82-a779-c1419a2bdd13/volumes" Dec 04 10:45:49 crc kubenswrapper[4831]: I1204 10:45:49.277239 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:45:49 crc kubenswrapper[4831]: E1204 10:45:49.277847 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:46:04 crc kubenswrapper[4831]: I1204 10:46:04.277578 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:46:04 crc kubenswrapper[4831]: E1204 10:46:04.278358 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:46:11 crc kubenswrapper[4831]: I1204 10:46:11.042219 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x2xtp"] Dec 04 10:46:11 crc kubenswrapper[4831]: I1204 10:46:11.054225 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x2xtp"] Dec 04 10:46:11 crc kubenswrapper[4831]: I1204 10:46:11.290419 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83153b32-d324-4aec-a5f3-ff0c0a47c0ee" path="/var/lib/kubelet/pods/83153b32-d324-4aec-a5f3-ff0c0a47c0ee/volumes" Dec 04 10:46:18 crc kubenswrapper[4831]: I1204 10:46:18.276560 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:46:18 crc kubenswrapper[4831]: E1204 10:46:18.277493 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:46:20 crc kubenswrapper[4831]: I1204 10:46:20.888367 4831 generic.go:334] "Generic (PLEG): container finished" podID="4acb6010-6e0f-45c8-b768-1bd3a9a090c8" containerID="92de8076f198470586c17cf59098a3abe87d4723998eabcd87989abacb699a4a" exitCode=0 Dec 04 10:46:20 crc kubenswrapper[4831]: I1204 10:46:20.888470 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" event={"ID":"4acb6010-6e0f-45c8-b768-1bd3a9a090c8","Type":"ContainerDied","Data":"92de8076f198470586c17cf59098a3abe87d4723998eabcd87989abacb699a4a"} Dec 04 10:46:22 crc 
kubenswrapper[4831]: I1204 10:46:22.410962 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.516686 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-inventory\") pod \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\" (UID: \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.516817 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lhcc\" (UniqueName: \"kubernetes.io/projected/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-kube-api-access-7lhcc\") pod \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\" (UID: \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.516879 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-ssh-key\") pod \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\" (UID: \"4acb6010-6e0f-45c8-b768-1bd3a9a090c8\") " Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.526273 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-kube-api-access-7lhcc" (OuterVolumeSpecName: "kube-api-access-7lhcc") pod "4acb6010-6e0f-45c8-b768-1bd3a9a090c8" (UID: "4acb6010-6e0f-45c8-b768-1bd3a9a090c8"). InnerVolumeSpecName "kube-api-access-7lhcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.554972 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-inventory" (OuterVolumeSpecName: "inventory") pod "4acb6010-6e0f-45c8-b768-1bd3a9a090c8" (UID: "4acb6010-6e0f-45c8-b768-1bd3a9a090c8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.557802 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4acb6010-6e0f-45c8-b768-1bd3a9a090c8" (UID: "4acb6010-6e0f-45c8-b768-1bd3a9a090c8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.619427 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.619491 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lhcc\" (UniqueName: \"kubernetes.io/projected/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-kube-api-access-7lhcc\") on node \"crc\" DevicePath \"\"" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.619506 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4acb6010-6e0f-45c8-b768-1bd3a9a090c8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.912914 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" 
event={"ID":"4acb6010-6e0f-45c8-b768-1bd3a9a090c8","Type":"ContainerDied","Data":"68a888bca141032878d9fcae1adc86140b4e32d6196e632fe53fb9464c0e3360"} Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.912963 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68a888bca141032878d9fcae1adc86140b4e32d6196e632fe53fb9464c0e3360" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.913031 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdn78" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.995965 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz"] Dec 04 10:46:22 crc kubenswrapper[4831]: E1204 10:46:22.996629 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b62802-882f-4b75-95ca-736ffafb4d63" containerName="collect-profiles" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.996675 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b62802-882f-4b75-95ca-736ffafb4d63" containerName="collect-profiles" Dec 04 10:46:22 crc kubenswrapper[4831]: E1204 10:46:22.996713 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4acb6010-6e0f-45c8-b768-1bd3a9a090c8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.996723 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4acb6010-6e0f-45c8-b768-1bd3a9a090c8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.996951 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4acb6010-6e0f-45c8-b768-1bd3a9a090c8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.996986 4831 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f2b62802-882f-4b75-95ca-736ffafb4d63" containerName="collect-profiles" Dec 04 10:46:22 crc kubenswrapper[4831]: I1204 10:46:22.998042 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.001847 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.002433 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.002645 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.002863 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.025451 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz"] Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.029841 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.029903 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrng7\" (UniqueName: \"kubernetes.io/projected/c5b9af01-df1b-475d-9333-b620a4eacf73-kube-api-access-qrng7\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.030007 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.131597 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.131980 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.132068 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrng7\" (UniqueName: \"kubernetes.io/projected/c5b9af01-df1b-475d-9333-b620a4eacf73-kube-api-access-qrng7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:23 crc 
kubenswrapper[4831]: I1204 10:46:23.141803 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.142539 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.150279 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrng7\" (UniqueName: \"kubernetes.io/projected/c5b9af01-df1b-475d-9333-b620a4eacf73-kube-api-access-qrng7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.322268 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.847318 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz"] Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.854620 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:46:23 crc kubenswrapper[4831]: I1204 10:46:23.921397 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" event={"ID":"c5b9af01-df1b-475d-9333-b620a4eacf73","Type":"ContainerStarted","Data":"6dfea721143ed8cb161eaa64bfd8e803bf2d2262388520552d63b2bdfa00d64b"} Dec 04 10:46:25 crc kubenswrapper[4831]: I1204 10:46:25.516120 4831 scope.go:117] "RemoveContainer" containerID="f1f6ba1c9cfda0d1b792b32aa7c5ecd02169e2bed716e062c9ea7a74dabd2a2b" Dec 04 10:46:25 crc kubenswrapper[4831]: I1204 10:46:25.543822 4831 scope.go:117] "RemoveContainer" containerID="3803d71b1b1890b734afe453af5bc2ca843a43328ac506860f50d2a6674b0fbb" Dec 04 10:46:25 crc kubenswrapper[4831]: I1204 10:46:25.735142 4831 scope.go:117] "RemoveContainer" containerID="ff06089c757328619340b12c1a486391c56eb662de2ec8be9963d598b2045d3a" Dec 04 10:46:25 crc kubenswrapper[4831]: I1204 10:46:25.762939 4831 scope.go:117] "RemoveContainer" containerID="fac8026c758d9cd1924bd9e9aa0add42ed0ae439dccf199606ae46498ddb299c" Dec 04 10:46:25 crc kubenswrapper[4831]: I1204 10:46:25.957768 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" event={"ID":"c5b9af01-df1b-475d-9333-b620a4eacf73","Type":"ContainerStarted","Data":"ffe2ad8f4c4c31880f26b6e9325dd4ed4071bf377c114dad5eea9b422f39ed57"} Dec 04 10:46:25 crc kubenswrapper[4831]: I1204 10:46:25.977339 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" podStartSLOduration=3.087244977 podStartE2EDuration="3.977314898s" podCreationTimestamp="2025-12-04 10:46:22 +0000 UTC" firstStartedPulling="2025-12-04 10:46:23.854341063 +0000 UTC m=+1880.803516377" lastFinishedPulling="2025-12-04 10:46:24.744410994 +0000 UTC m=+1881.693586298" observedRunningTime="2025-12-04 10:46:25.973450626 +0000 UTC m=+1882.922625970" watchObservedRunningTime="2025-12-04 10:46:25.977314898 +0000 UTC m=+1882.926490222" Dec 04 10:46:29 crc kubenswrapper[4831]: I1204 10:46:29.997926 4831 generic.go:334] "Generic (PLEG): container finished" podID="c5b9af01-df1b-475d-9333-b620a4eacf73" containerID="ffe2ad8f4c4c31880f26b6e9325dd4ed4071bf377c114dad5eea9b422f39ed57" exitCode=0 Dec 04 10:46:29 crc kubenswrapper[4831]: I1204 10:46:29.998041 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" event={"ID":"c5b9af01-df1b-475d-9333-b620a4eacf73","Type":"ContainerDied","Data":"ffe2ad8f4c4c31880f26b6e9325dd4ed4071bf377c114dad5eea9b422f39ed57"} Dec 04 10:46:30 crc kubenswrapper[4831]: I1204 10:46:30.277084 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:46:30 crc kubenswrapper[4831]: E1204 10:46:30.277378 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:46:31 crc kubenswrapper[4831]: I1204 10:46:31.453760 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:31 crc kubenswrapper[4831]: I1204 10:46:31.500244 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-inventory\") pod \"c5b9af01-df1b-475d-9333-b620a4eacf73\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " Dec 04 10:46:31 crc kubenswrapper[4831]: I1204 10:46:31.500467 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-ssh-key\") pod \"c5b9af01-df1b-475d-9333-b620a4eacf73\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " Dec 04 10:46:31 crc kubenswrapper[4831]: I1204 10:46:31.500522 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrng7\" (UniqueName: \"kubernetes.io/projected/c5b9af01-df1b-475d-9333-b620a4eacf73-kube-api-access-qrng7\") pod \"c5b9af01-df1b-475d-9333-b620a4eacf73\" (UID: \"c5b9af01-df1b-475d-9333-b620a4eacf73\") " Dec 04 10:46:31 crc kubenswrapper[4831]: I1204 10:46:31.506843 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b9af01-df1b-475d-9333-b620a4eacf73-kube-api-access-qrng7" (OuterVolumeSpecName: "kube-api-access-qrng7") pod "c5b9af01-df1b-475d-9333-b620a4eacf73" (UID: "c5b9af01-df1b-475d-9333-b620a4eacf73"). InnerVolumeSpecName "kube-api-access-qrng7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:46:31 crc kubenswrapper[4831]: I1204 10:46:31.526682 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c5b9af01-df1b-475d-9333-b620a4eacf73" (UID: "c5b9af01-df1b-475d-9333-b620a4eacf73"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:46:31 crc kubenswrapper[4831]: I1204 10:46:31.555760 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-inventory" (OuterVolumeSpecName: "inventory") pod "c5b9af01-df1b-475d-9333-b620a4eacf73" (UID: "c5b9af01-df1b-475d-9333-b620a4eacf73"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:46:31 crc kubenswrapper[4831]: I1204 10:46:31.603542 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:46:31 crc kubenswrapper[4831]: I1204 10:46:31.603572 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5b9af01-df1b-475d-9333-b620a4eacf73-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:46:31 crc kubenswrapper[4831]: I1204 10:46:31.603627 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrng7\" (UniqueName: \"kubernetes.io/projected/c5b9af01-df1b-475d-9333-b620a4eacf73-kube-api-access-qrng7\") on node \"crc\" DevicePath \"\"" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.022508 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" event={"ID":"c5b9af01-df1b-475d-9333-b620a4eacf73","Type":"ContainerDied","Data":"6dfea721143ed8cb161eaa64bfd8e803bf2d2262388520552d63b2bdfa00d64b"} Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.022771 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dfea721143ed8cb161eaa64bfd8e803bf2d2262388520552d63b2bdfa00d64b" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.022585 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz" Dec 04 10:46:32 crc kubenswrapper[4831]: E1204 10:46:32.083962 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5b9af01_df1b_475d_9333_b620a4eacf73.slice\": RecentStats: unable to find data in memory cache]" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.098772 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr"] Dec 04 10:46:32 crc kubenswrapper[4831]: E1204 10:46:32.099532 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b9af01-df1b-475d-9333-b620a4eacf73" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.099605 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b9af01-df1b-475d-9333-b620a4eacf73" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.099866 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b9af01-df1b-475d-9333-b620a4eacf73" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.100565 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.106166 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.108351 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.108721 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.112852 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.114396 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cdjgr\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.114449 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjbmn\" (UniqueName: \"kubernetes.io/projected/6caf006b-f650-4ea1-89bd-466b524c2049-kube-api-access-fjbmn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cdjgr\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.114627 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-cdjgr\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.118945 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr"] Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.217749 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cdjgr\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.217801 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjbmn\" (UniqueName: \"kubernetes.io/projected/6caf006b-f650-4ea1-89bd-466b524c2049-kube-api-access-fjbmn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cdjgr\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.218028 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cdjgr\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.223468 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cdjgr\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") 
" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.223538 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cdjgr\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.236146 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjbmn\" (UniqueName: \"kubernetes.io/projected/6caf006b-f650-4ea1-89bd-466b524c2049-kube-api-access-fjbmn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cdjgr\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:46:32 crc kubenswrapper[4831]: I1204 10:46:32.420553 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:46:33 crc kubenswrapper[4831]: I1204 10:46:33.027205 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr"] Dec 04 10:46:33 crc kubenswrapper[4831]: W1204 10:46:33.031195 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6caf006b_f650_4ea1_89bd_466b524c2049.slice/crio-2506ca33ebe315fd864e30b3ac8923a8df7129536675b2a875cff498b170ccad WatchSource:0}: Error finding container 2506ca33ebe315fd864e30b3ac8923a8df7129536675b2a875cff498b170ccad: Status 404 returned error can't find the container with id 2506ca33ebe315fd864e30b3ac8923a8df7129536675b2a875cff498b170ccad Dec 04 10:46:34 crc kubenswrapper[4831]: I1204 10:46:34.032535 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mmb2j"] Dec 04 10:46:34 crc kubenswrapper[4831]: I1204 10:46:34.043170 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mmb2j"] Dec 04 10:46:34 crc kubenswrapper[4831]: I1204 10:46:34.043526 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" event={"ID":"6caf006b-f650-4ea1-89bd-466b524c2049","Type":"ContainerStarted","Data":"0810cf7133fb6e7db30e2bc8a02cd84fecf6a809e78cdadaac02df6e65c4df6a"} Dec 04 10:46:34 crc kubenswrapper[4831]: I1204 10:46:34.043573 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" event={"ID":"6caf006b-f650-4ea1-89bd-466b524c2049","Type":"ContainerStarted","Data":"2506ca33ebe315fd864e30b3ac8923a8df7129536675b2a875cff498b170ccad"} Dec 04 10:46:34 crc kubenswrapper[4831]: I1204 10:46:34.070183 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" podStartSLOduration=1.448237025 podStartE2EDuration="2.070164537s" podCreationTimestamp="2025-12-04 10:46:32 +0000 UTC" firstStartedPulling="2025-12-04 10:46:33.033697999 +0000 UTC m=+1889.982873313" lastFinishedPulling="2025-12-04 10:46:33.655625501 +0000 UTC m=+1890.604800825" observedRunningTime="2025-12-04 10:46:34.060468231 +0000 UTC m=+1891.009643565" watchObservedRunningTime="2025-12-04 10:46:34.070164537 +0000 UTC m=+1891.019339851" Dec 04 10:46:35 crc kubenswrapper[4831]: I1204 10:46:35.292965 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dbc983b-0bfd-4646-9a07-4f0894f1c480" path="/var/lib/kubelet/pods/2dbc983b-0bfd-4646-9a07-4f0894f1c480/volumes" Dec 04 10:46:37 crc kubenswrapper[4831]: I1204 10:46:37.034055 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r67gq"] Dec 04 10:46:37 crc kubenswrapper[4831]: I1204 10:46:37.052676 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r67gq"] Dec 04 10:46:37 crc kubenswrapper[4831]: I1204 10:46:37.291752 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a6469d-082d-4be1-ac56-bf92b750390d" path="/var/lib/kubelet/pods/c6a6469d-082d-4be1-ac56-bf92b750390d/volumes" Dec 04 10:46:45 crc kubenswrapper[4831]: I1204 10:46:45.276292 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:46:45 crc kubenswrapper[4831]: E1204 10:46:45.277234 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" 
podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:46:59 crc kubenswrapper[4831]: I1204 10:46:59.279234 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:46:59 crc kubenswrapper[4831]: E1204 10:46:59.280167 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.404311 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kthqz"] Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.407297 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.430977 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kthqz"] Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.487038 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-utilities\") pod \"redhat-marketplace-kthqz\" (UID: \"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.487165 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-catalog-content\") pod \"redhat-marketplace-kthqz\" (UID: 
\"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.487210 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlzh4\" (UniqueName: \"kubernetes.io/projected/cef09d83-7db7-46be-b7e3-cecd6dc28139-kube-api-access-tlzh4\") pod \"redhat-marketplace-kthqz\" (UID: \"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.589876 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-utilities\") pod \"redhat-marketplace-kthqz\" (UID: \"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.589965 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-catalog-content\") pod \"redhat-marketplace-kthqz\" (UID: \"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.590037 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlzh4\" (UniqueName: \"kubernetes.io/projected/cef09d83-7db7-46be-b7e3-cecd6dc28139-kube-api-access-tlzh4\") pod \"redhat-marketplace-kthqz\" (UID: \"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.590548 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-utilities\") pod \"redhat-marketplace-kthqz\" (UID: 
\"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.590640 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-catalog-content\") pod \"redhat-marketplace-kthqz\" (UID: \"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.609739 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlzh4\" (UniqueName: \"kubernetes.io/projected/cef09d83-7db7-46be-b7e3-cecd6dc28139-kube-api-access-tlzh4\") pod \"redhat-marketplace-kthqz\" (UID: \"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:00 crc kubenswrapper[4831]: I1204 10:47:00.767908 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:01 crc kubenswrapper[4831]: I1204 10:47:01.305415 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kthqz"] Dec 04 10:47:01 crc kubenswrapper[4831]: I1204 10:47:01.336654 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kthqz" event={"ID":"cef09d83-7db7-46be-b7e3-cecd6dc28139","Type":"ContainerStarted","Data":"999c119d3ed84af0aa2738cb4cfe2c0c48304173e526b3b01b765e46ee2a8e17"} Dec 04 10:47:02 crc kubenswrapper[4831]: I1204 10:47:02.349093 4831 generic.go:334] "Generic (PLEG): container finished" podID="cef09d83-7db7-46be-b7e3-cecd6dc28139" containerID="371cdb3e426a27731c8dbe00b3e549176de85ae8be42e17781471266a50e5a44" exitCode=0 Dec 04 10:47:02 crc kubenswrapper[4831]: I1204 10:47:02.349192 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kthqz" 
event={"ID":"cef09d83-7db7-46be-b7e3-cecd6dc28139","Type":"ContainerDied","Data":"371cdb3e426a27731c8dbe00b3e549176de85ae8be42e17781471266a50e5a44"} Dec 04 10:47:03 crc kubenswrapper[4831]: I1204 10:47:03.360806 4831 generic.go:334] "Generic (PLEG): container finished" podID="cef09d83-7db7-46be-b7e3-cecd6dc28139" containerID="32b03b9cab358f0b7bf4301ab3dd8664f48c833349ecdf730786c4a9025fe7ad" exitCode=0 Dec 04 10:47:03 crc kubenswrapper[4831]: I1204 10:47:03.360848 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kthqz" event={"ID":"cef09d83-7db7-46be-b7e3-cecd6dc28139","Type":"ContainerDied","Data":"32b03b9cab358f0b7bf4301ab3dd8664f48c833349ecdf730786c4a9025fe7ad"} Dec 04 10:47:04 crc kubenswrapper[4831]: I1204 10:47:04.372432 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kthqz" event={"ID":"cef09d83-7db7-46be-b7e3-cecd6dc28139","Type":"ContainerStarted","Data":"4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07"} Dec 04 10:47:04 crc kubenswrapper[4831]: I1204 10:47:04.397982 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kthqz" podStartSLOduration=2.925421444 podStartE2EDuration="4.397958596s" podCreationTimestamp="2025-12-04 10:47:00 +0000 UTC" firstStartedPulling="2025-12-04 10:47:02.352141048 +0000 UTC m=+1919.301316362" lastFinishedPulling="2025-12-04 10:47:03.8246782 +0000 UTC m=+1920.773853514" observedRunningTime="2025-12-04 10:47:04.391038094 +0000 UTC m=+1921.340213418" watchObservedRunningTime="2025-12-04 10:47:04.397958596 +0000 UTC m=+1921.347133910" Dec 04 10:47:10 crc kubenswrapper[4831]: I1204 10:47:10.768970 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:10 crc kubenswrapper[4831]: I1204 10:47:10.769638 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:10 crc kubenswrapper[4831]: I1204 10:47:10.822463 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:11 crc kubenswrapper[4831]: I1204 10:47:11.439883 4831 generic.go:334] "Generic (PLEG): container finished" podID="6caf006b-f650-4ea1-89bd-466b524c2049" containerID="0810cf7133fb6e7db30e2bc8a02cd84fecf6a809e78cdadaac02df6e65c4df6a" exitCode=0 Dec 04 10:47:11 crc kubenswrapper[4831]: I1204 10:47:11.440785 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" event={"ID":"6caf006b-f650-4ea1-89bd-466b524c2049","Type":"ContainerDied","Data":"0810cf7133fb6e7db30e2bc8a02cd84fecf6a809e78cdadaac02df6e65c4df6a"} Dec 04 10:47:11 crc kubenswrapper[4831]: I1204 10:47:11.488586 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:12 crc kubenswrapper[4831]: I1204 10:47:12.392435 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kthqz"] Dec 04 10:47:12 crc kubenswrapper[4831]: I1204 10:47:12.899166 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:47:12 crc kubenswrapper[4831]: I1204 10:47:12.957413 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjbmn\" (UniqueName: \"kubernetes.io/projected/6caf006b-f650-4ea1-89bd-466b524c2049-kube-api-access-fjbmn\") pod \"6caf006b-f650-4ea1-89bd-466b524c2049\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") " Dec 04 10:47:12 crc kubenswrapper[4831]: I1204 10:47:12.957686 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-ssh-key\") pod \"6caf006b-f650-4ea1-89bd-466b524c2049\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") " Dec 04 10:47:12 crc kubenswrapper[4831]: I1204 10:47:12.957875 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-inventory\") pod \"6caf006b-f650-4ea1-89bd-466b524c2049\" (UID: \"6caf006b-f650-4ea1-89bd-466b524c2049\") " Dec 04 10:47:12 crc kubenswrapper[4831]: I1204 10:47:12.966297 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6caf006b-f650-4ea1-89bd-466b524c2049-kube-api-access-fjbmn" (OuterVolumeSpecName: "kube-api-access-fjbmn") pod "6caf006b-f650-4ea1-89bd-466b524c2049" (UID: "6caf006b-f650-4ea1-89bd-466b524c2049"). InnerVolumeSpecName "kube-api-access-fjbmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:47:12 crc kubenswrapper[4831]: I1204 10:47:12.996399 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6caf006b-f650-4ea1-89bd-466b524c2049" (UID: "6caf006b-f650-4ea1-89bd-466b524c2049"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.009501 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-inventory" (OuterVolumeSpecName: "inventory") pod "6caf006b-f650-4ea1-89bd-466b524c2049" (UID: "6caf006b-f650-4ea1-89bd-466b524c2049"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.059330 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.059407 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjbmn\" (UniqueName: \"kubernetes.io/projected/6caf006b-f650-4ea1-89bd-466b524c2049-kube-api-access-fjbmn\") on node \"crc\" DevicePath \"\"" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.059418 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6caf006b-f650-4ea1-89bd-466b524c2049-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.286926 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:47:13 crc kubenswrapper[4831]: E1204 10:47:13.287531 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.464961 
4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.464961 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cdjgr" event={"ID":"6caf006b-f650-4ea1-89bd-466b524c2049","Type":"ContainerDied","Data":"2506ca33ebe315fd864e30b3ac8923a8df7129536675b2a875cff498b170ccad"} Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.465022 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2506ca33ebe315fd864e30b3ac8923a8df7129536675b2a875cff498b170ccad" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.465097 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kthqz" podUID="cef09d83-7db7-46be-b7e3-cecd6dc28139" containerName="registry-server" containerID="cri-o://4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07" gracePeriod=2 Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.552991 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd"] Dec 04 10:47:13 crc kubenswrapper[4831]: E1204 10:47:13.553436 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6caf006b-f650-4ea1-89bd-466b524c2049" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.553452 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6caf006b-f650-4ea1-89bd-466b524c2049" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.553759 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6caf006b-f650-4ea1-89bd-466b524c2049" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:47:13 crc kubenswrapper[4831]: 
I1204 10:47:13.554442 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.557154 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.560219 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.560387 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.560832 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.567776 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd"] Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.674034 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkxn6\" (UniqueName: \"kubernetes.io/projected/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-kube-api-access-fkxn6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6glvd\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.674236 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6glvd\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" 
Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.674320 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6glvd\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.776593 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkxn6\" (UniqueName: \"kubernetes.io/projected/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-kube-api-access-fkxn6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6glvd\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.776816 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6glvd\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.776898 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6glvd\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.785021 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-6glvd\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.785021 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6glvd\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:47:13 crc kubenswrapper[4831]: I1204 10:47:13.803824 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkxn6\" (UniqueName: \"kubernetes.io/projected/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-kube-api-access-fkxn6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6glvd\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.001418 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.448490 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.492518 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-catalog-content\") pod \"cef09d83-7db7-46be-b7e3-cecd6dc28139\" (UID: \"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.492682 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlzh4\" (UniqueName: \"kubernetes.io/projected/cef09d83-7db7-46be-b7e3-cecd6dc28139-kube-api-access-tlzh4\") pod \"cef09d83-7db7-46be-b7e3-cecd6dc28139\" (UID: \"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.492860 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-utilities\") pod \"cef09d83-7db7-46be-b7e3-cecd6dc28139\" (UID: \"cef09d83-7db7-46be-b7e3-cecd6dc28139\") " Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.496302 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-utilities" (OuterVolumeSpecName: "utilities") pod "cef09d83-7db7-46be-b7e3-cecd6dc28139" (UID: "cef09d83-7db7-46be-b7e3-cecd6dc28139"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.502358 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef09d83-7db7-46be-b7e3-cecd6dc28139-kube-api-access-tlzh4" (OuterVolumeSpecName: "kube-api-access-tlzh4") pod "cef09d83-7db7-46be-b7e3-cecd6dc28139" (UID: "cef09d83-7db7-46be-b7e3-cecd6dc28139"). InnerVolumeSpecName "kube-api-access-tlzh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.511872 4831 generic.go:334] "Generic (PLEG): container finished" podID="cef09d83-7db7-46be-b7e3-cecd6dc28139" containerID="4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07" exitCode=0 Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.511934 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kthqz" event={"ID":"cef09d83-7db7-46be-b7e3-cecd6dc28139","Type":"ContainerDied","Data":"4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07"} Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.511966 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kthqz" event={"ID":"cef09d83-7db7-46be-b7e3-cecd6dc28139","Type":"ContainerDied","Data":"999c119d3ed84af0aa2738cb4cfe2c0c48304173e526b3b01b765e46ee2a8e17"} Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.511991 4831 scope.go:117] "RemoveContainer" containerID="4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.512479 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kthqz" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.528348 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cef09d83-7db7-46be-b7e3-cecd6dc28139" (UID: "cef09d83-7db7-46be-b7e3-cecd6dc28139"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.542187 4831 scope.go:117] "RemoveContainer" containerID="32b03b9cab358f0b7bf4301ab3dd8664f48c833349ecdf730786c4a9025fe7ad" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.580314 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd"] Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.597858 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.597910 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef09d83-7db7-46be-b7e3-cecd6dc28139-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.597931 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlzh4\" (UniqueName: \"kubernetes.io/projected/cef09d83-7db7-46be-b7e3-cecd6dc28139-kube-api-access-tlzh4\") on node \"crc\" DevicePath \"\"" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.603926 4831 scope.go:117] "RemoveContainer" containerID="371cdb3e426a27731c8dbe00b3e549176de85ae8be42e17781471266a50e5a44" Dec 04 10:47:14 crc kubenswrapper[4831]: W1204 10:47:14.613837 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62c2750f_dec9_4b21_b4c2_e1ee98b3b754.slice/crio-b073467db8d41c4a731d2e6307dbcc99440a5869a7dd07bc1a3614976d35bc86 WatchSource:0}: Error finding container b073467db8d41c4a731d2e6307dbcc99440a5869a7dd07bc1a3614976d35bc86: Status 404 returned error can't find the container with id b073467db8d41c4a731d2e6307dbcc99440a5869a7dd07bc1a3614976d35bc86 Dec 04 10:47:14 crc kubenswrapper[4831]: 
I1204 10:47:14.640568 4831 scope.go:117] "RemoveContainer" containerID="4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07" Dec 04 10:47:14 crc kubenswrapper[4831]: E1204 10:47:14.641160 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07\": container with ID starting with 4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07 not found: ID does not exist" containerID="4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.641199 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07"} err="failed to get container status \"4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07\": rpc error: code = NotFound desc = could not find container \"4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07\": container with ID starting with 4d18820089c7d9d018fb92fd9f4aea8724caaac97feeca24f377db0180827d07 not found: ID does not exist" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.641232 4831 scope.go:117] "RemoveContainer" containerID="32b03b9cab358f0b7bf4301ab3dd8664f48c833349ecdf730786c4a9025fe7ad" Dec 04 10:47:14 crc kubenswrapper[4831]: E1204 10:47:14.641476 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b03b9cab358f0b7bf4301ab3dd8664f48c833349ecdf730786c4a9025fe7ad\": container with ID starting with 32b03b9cab358f0b7bf4301ab3dd8664f48c833349ecdf730786c4a9025fe7ad not found: ID does not exist" containerID="32b03b9cab358f0b7bf4301ab3dd8664f48c833349ecdf730786c4a9025fe7ad" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.641509 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"32b03b9cab358f0b7bf4301ab3dd8664f48c833349ecdf730786c4a9025fe7ad"} err="failed to get container status \"32b03b9cab358f0b7bf4301ab3dd8664f48c833349ecdf730786c4a9025fe7ad\": rpc error: code = NotFound desc = could not find container \"32b03b9cab358f0b7bf4301ab3dd8664f48c833349ecdf730786c4a9025fe7ad\": container with ID starting with 32b03b9cab358f0b7bf4301ab3dd8664f48c833349ecdf730786c4a9025fe7ad not found: ID does not exist" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.641526 4831 scope.go:117] "RemoveContainer" containerID="371cdb3e426a27731c8dbe00b3e549176de85ae8be42e17781471266a50e5a44" Dec 04 10:47:14 crc kubenswrapper[4831]: E1204 10:47:14.641935 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"371cdb3e426a27731c8dbe00b3e549176de85ae8be42e17781471266a50e5a44\": container with ID starting with 371cdb3e426a27731c8dbe00b3e549176de85ae8be42e17781471266a50e5a44 not found: ID does not exist" containerID="371cdb3e426a27731c8dbe00b3e549176de85ae8be42e17781471266a50e5a44" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.641968 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"371cdb3e426a27731c8dbe00b3e549176de85ae8be42e17781471266a50e5a44"} err="failed to get container status \"371cdb3e426a27731c8dbe00b3e549176de85ae8be42e17781471266a50e5a44\": rpc error: code = NotFound desc = could not find container \"371cdb3e426a27731c8dbe00b3e549176de85ae8be42e17781471266a50e5a44\": container with ID starting with 371cdb3e426a27731c8dbe00b3e549176de85ae8be42e17781471266a50e5a44 not found: ID does not exist" Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.871716 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kthqz"] Dec 04 10:47:14 crc kubenswrapper[4831]: I1204 10:47:14.880011 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-kthqz"] Dec 04 10:47:15 crc kubenswrapper[4831]: I1204 10:47:15.298399 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef09d83-7db7-46be-b7e3-cecd6dc28139" path="/var/lib/kubelet/pods/cef09d83-7db7-46be-b7e3-cecd6dc28139/volumes" Dec 04 10:47:15 crc kubenswrapper[4831]: I1204 10:47:15.524729 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" event={"ID":"62c2750f-dec9-4b21-b4c2-e1ee98b3b754","Type":"ContainerStarted","Data":"f64d79096a73250c13190ae5e6ff1cb8d46592d36823d698fb442876c9522cb2"} Dec 04 10:47:15 crc kubenswrapper[4831]: I1204 10:47:15.524847 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" event={"ID":"62c2750f-dec9-4b21-b4c2-e1ee98b3b754","Type":"ContainerStarted","Data":"b073467db8d41c4a731d2e6307dbcc99440a5869a7dd07bc1a3614976d35bc86"} Dec 04 10:47:15 crc kubenswrapper[4831]: I1204 10:47:15.557206 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" podStartSLOduration=2.136316768 podStartE2EDuration="2.55718621s" podCreationTimestamp="2025-12-04 10:47:13 +0000 UTC" firstStartedPulling="2025-12-04 10:47:14.618255788 +0000 UTC m=+1931.567431102" lastFinishedPulling="2025-12-04 10:47:15.03912523 +0000 UTC m=+1931.988300544" observedRunningTime="2025-12-04 10:47:15.55340867 +0000 UTC m=+1932.502583984" watchObservedRunningTime="2025-12-04 10:47:15.55718621 +0000 UTC m=+1932.506361524" Dec 04 10:47:19 crc kubenswrapper[4831]: I1204 10:47:19.044859 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6wj7l"] Dec 04 10:47:19 crc kubenswrapper[4831]: I1204 10:47:19.058833 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6wj7l"] Dec 04 10:47:19 crc 
kubenswrapper[4831]: I1204 10:47:19.289597 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="577e755d-3044-4c40-bdba-51fe1291b774" path="/var/lib/kubelet/pods/577e755d-3044-4c40-bdba-51fe1291b774/volumes" Dec 04 10:47:25 crc kubenswrapper[4831]: I1204 10:47:25.890541 4831 scope.go:117] "RemoveContainer" containerID="b61459417786d9bc254d7a785d825caf7f95f76f29934f94f5ef29526dfebf73" Dec 04 10:47:25 crc kubenswrapper[4831]: I1204 10:47:25.951223 4831 scope.go:117] "RemoveContainer" containerID="76f66e055faa71edef7fd22c90355f8ef99f2fbe084150654596ff28ab7ea2f9" Dec 04 10:47:25 crc kubenswrapper[4831]: I1204 10:47:25.989835 4831 scope.go:117] "RemoveContainer" containerID="f626c01b67169f6dbf326e381e3a344d2ef41499a05917f879afca21d3c87fad" Dec 04 10:47:26 crc kubenswrapper[4831]: I1204 10:47:26.277133 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:47:26 crc kubenswrapper[4831]: E1204 10:47:26.277580 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:47:37 crc kubenswrapper[4831]: I1204 10:47:37.277265 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:47:37 crc kubenswrapper[4831]: E1204 10:47:37.278402 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:47:48 crc kubenswrapper[4831]: I1204 10:47:48.276680 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:47:48 crc kubenswrapper[4831]: E1204 10:47:48.277605 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:48:00 crc kubenswrapper[4831]: I1204 10:48:00.276305 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:48:00 crc kubenswrapper[4831]: E1204 10:48:00.278575 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:48:08 crc kubenswrapper[4831]: I1204 10:48:08.095601 4831 generic.go:334] "Generic (PLEG): container finished" podID="62c2750f-dec9-4b21-b4c2-e1ee98b3b754" containerID="f64d79096a73250c13190ae5e6ff1cb8d46592d36823d698fb442876c9522cb2" exitCode=0 Dec 04 10:48:08 crc kubenswrapper[4831]: I1204 10:48:08.095729 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" 
event={"ID":"62c2750f-dec9-4b21-b4c2-e1ee98b3b754","Type":"ContainerDied","Data":"f64d79096a73250c13190ae5e6ff1cb8d46592d36823d698fb442876c9522cb2"} Dec 04 10:48:09 crc kubenswrapper[4831]: I1204 10:48:09.623891 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:48:09 crc kubenswrapper[4831]: I1204 10:48:09.708312 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-inventory\") pod \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " Dec 04 10:48:09 crc kubenswrapper[4831]: I1204 10:48:09.708485 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkxn6\" (UniqueName: \"kubernetes.io/projected/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-kube-api-access-fkxn6\") pod \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " Dec 04 10:48:09 crc kubenswrapper[4831]: I1204 10:48:09.708644 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-ssh-key\") pod \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\" (UID: \"62c2750f-dec9-4b21-b4c2-e1ee98b3b754\") " Dec 04 10:48:09 crc kubenswrapper[4831]: I1204 10:48:09.715969 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-kube-api-access-fkxn6" (OuterVolumeSpecName: "kube-api-access-fkxn6") pod "62c2750f-dec9-4b21-b4c2-e1ee98b3b754" (UID: "62c2750f-dec9-4b21-b4c2-e1ee98b3b754"). InnerVolumeSpecName "kube-api-access-fkxn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:48:09 crc kubenswrapper[4831]: I1204 10:48:09.757134 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-inventory" (OuterVolumeSpecName: "inventory") pod "62c2750f-dec9-4b21-b4c2-e1ee98b3b754" (UID: "62c2750f-dec9-4b21-b4c2-e1ee98b3b754"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:48:09 crc kubenswrapper[4831]: I1204 10:48:09.779114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "62c2750f-dec9-4b21-b4c2-e1ee98b3b754" (UID: "62c2750f-dec9-4b21-b4c2-e1ee98b3b754"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:48:09 crc kubenswrapper[4831]: I1204 10:48:09.811089 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:09 crc kubenswrapper[4831]: I1204 10:48:09.811131 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkxn6\" (UniqueName: \"kubernetes.io/projected/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-kube-api-access-fkxn6\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:09 crc kubenswrapper[4831]: I1204 10:48:09.811142 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62c2750f-dec9-4b21-b4c2-e1ee98b3b754-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.117207 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" 
event={"ID":"62c2750f-dec9-4b21-b4c2-e1ee98b3b754","Type":"ContainerDied","Data":"b073467db8d41c4a731d2e6307dbcc99440a5869a7dd07bc1a3614976d35bc86"} Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.117805 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b073467db8d41c4a731d2e6307dbcc99440a5869a7dd07bc1a3614976d35bc86" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.117310 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6glvd" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.241849 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rjtxp"] Dec 04 10:48:10 crc kubenswrapper[4831]: E1204 10:48:10.242431 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c2750f-dec9-4b21-b4c2-e1ee98b3b754" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.242454 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c2750f-dec9-4b21-b4c2-e1ee98b3b754" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:48:10 crc kubenswrapper[4831]: E1204 10:48:10.242476 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef09d83-7db7-46be-b7e3-cecd6dc28139" containerName="extract-utilities" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.242483 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef09d83-7db7-46be-b7e3-cecd6dc28139" containerName="extract-utilities" Dec 04 10:48:10 crc kubenswrapper[4831]: E1204 10:48:10.242505 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef09d83-7db7-46be-b7e3-cecd6dc28139" containerName="registry-server" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.242511 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef09d83-7db7-46be-b7e3-cecd6dc28139" 
containerName="registry-server" Dec 04 10:48:10 crc kubenswrapper[4831]: E1204 10:48:10.242560 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef09d83-7db7-46be-b7e3-cecd6dc28139" containerName="extract-content" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.242574 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef09d83-7db7-46be-b7e3-cecd6dc28139" containerName="extract-content" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.242867 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c2750f-dec9-4b21-b4c2-e1ee98b3b754" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.242891 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef09d83-7db7-46be-b7e3-cecd6dc28139" containerName="registry-server" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.244256 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.246731 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.247298 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.247434 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.247572 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.257442 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rjtxp"] Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 
10:48:10.322244 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rjtxp\" (UID: \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.322396 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrpzq\" (UniqueName: \"kubernetes.io/projected/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-kube-api-access-lrpzq\") pod \"ssh-known-hosts-edpm-deployment-rjtxp\" (UID: \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.322548 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rjtxp\" (UID: \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.424136 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rjtxp\" (UID: \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.424269 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrpzq\" (UniqueName: \"kubernetes.io/projected/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-kube-api-access-lrpzq\") pod \"ssh-known-hosts-edpm-deployment-rjtxp\" (UID: 
\"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.424514 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rjtxp\" (UID: \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.429665 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rjtxp\" (UID: \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.429967 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rjtxp\" (UID: \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.442848 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrpzq\" (UniqueName: \"kubernetes.io/projected/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-kube-api-access-lrpzq\") pod \"ssh-known-hosts-edpm-deployment-rjtxp\" (UID: \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:10 crc kubenswrapper[4831]: I1204 10:48:10.574071 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:11 crc kubenswrapper[4831]: I1204 10:48:11.102996 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rjtxp"] Dec 04 10:48:11 crc kubenswrapper[4831]: I1204 10:48:11.142632 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" event={"ID":"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758","Type":"ContainerStarted","Data":"340a37d2531c8427ab9abcd8c72546e30a9076ac9842647c398c6d1695c51d56"} Dec 04 10:48:11 crc kubenswrapper[4831]: I1204 10:48:11.277760 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:48:11 crc kubenswrapper[4831]: E1204 10:48:11.278019 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:48:12 crc kubenswrapper[4831]: I1204 10:48:12.152997 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" event={"ID":"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758","Type":"ContainerStarted","Data":"ab2cfade96f2b1b82623d7a47e5d29a7792c017b858f1429cc90cfa6d61c5e24"} Dec 04 10:48:12 crc kubenswrapper[4831]: I1204 10:48:12.178916 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" podStartSLOduration=1.755020725 podStartE2EDuration="2.178896988s" podCreationTimestamp="2025-12-04 10:48:10 +0000 UTC" firstStartedPulling="2025-12-04 10:48:11.106620866 +0000 UTC m=+1988.055796180" lastFinishedPulling="2025-12-04 
10:48:11.530497129 +0000 UTC m=+1988.479672443" observedRunningTime="2025-12-04 10:48:12.172988012 +0000 UTC m=+1989.122163326" watchObservedRunningTime="2025-12-04 10:48:12.178896988 +0000 UTC m=+1989.128072302" Dec 04 10:48:19 crc kubenswrapper[4831]: I1204 10:48:19.230621 4831 generic.go:334] "Generic (PLEG): container finished" podID="b4e80c5d-a48e-4f72-b5f5-b3c5ed331758" containerID="ab2cfade96f2b1b82623d7a47e5d29a7792c017b858f1429cc90cfa6d61c5e24" exitCode=0 Dec 04 10:48:19 crc kubenswrapper[4831]: I1204 10:48:19.230687 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" event={"ID":"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758","Type":"ContainerDied","Data":"ab2cfade96f2b1b82623d7a47e5d29a7792c017b858f1429cc90cfa6d61c5e24"} Dec 04 10:48:20 crc kubenswrapper[4831]: I1204 10:48:20.633511 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:20 crc kubenswrapper[4831]: I1204 10:48:20.748142 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-inventory-0\") pod \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\" (UID: \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " Dec 04 10:48:20 crc kubenswrapper[4831]: I1204 10:48:20.748590 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-ssh-key-openstack-edpm-ipam\") pod \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\" (UID: \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " Dec 04 10:48:20 crc kubenswrapper[4831]: I1204 10:48:20.748692 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrpzq\" (UniqueName: \"kubernetes.io/projected/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-kube-api-access-lrpzq\") pod 
\"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\" (UID: \"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758\") " Dec 04 10:48:20 crc kubenswrapper[4831]: I1204 10:48:20.753899 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-kube-api-access-lrpzq" (OuterVolumeSpecName: "kube-api-access-lrpzq") pod "b4e80c5d-a48e-4f72-b5f5-b3c5ed331758" (UID: "b4e80c5d-a48e-4f72-b5f5-b3c5ed331758"). InnerVolumeSpecName "kube-api-access-lrpzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:48:20 crc kubenswrapper[4831]: I1204 10:48:20.774868 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b4e80c5d-a48e-4f72-b5f5-b3c5ed331758" (UID: "b4e80c5d-a48e-4f72-b5f5-b3c5ed331758"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:48:20 crc kubenswrapper[4831]: I1204 10:48:20.786324 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b4e80c5d-a48e-4f72-b5f5-b3c5ed331758" (UID: "b4e80c5d-a48e-4f72-b5f5-b3c5ed331758"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:48:20 crc kubenswrapper[4831]: I1204 10:48:20.851230 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:20 crc kubenswrapper[4831]: I1204 10:48:20.851260 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrpzq\" (UniqueName: \"kubernetes.io/projected/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-kube-api-access-lrpzq\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:20 crc kubenswrapper[4831]: I1204 10:48:20.851272 4831 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b4e80c5d-a48e-4f72-b5f5-b3c5ed331758-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.251081 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" event={"ID":"b4e80c5d-a48e-4f72-b5f5-b3c5ed331758","Type":"ContainerDied","Data":"340a37d2531c8427ab9abcd8c72546e30a9076ac9842647c398c6d1695c51d56"} Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.251121 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="340a37d2531c8427ab9abcd8c72546e30a9076ac9842647c398c6d1695c51d56" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.251161 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rjtxp" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.319416 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp"] Dec 04 10:48:21 crc kubenswrapper[4831]: E1204 10:48:21.319892 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e80c5d-a48e-4f72-b5f5-b3c5ed331758" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.319914 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e80c5d-a48e-4f72-b5f5-b3c5ed331758" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.320151 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e80c5d-a48e-4f72-b5f5-b3c5ed331758" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.321530 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.327471 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.327734 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.327875 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.327998 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.333137 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp"] Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.480459 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4wcq\" (UniqueName: \"kubernetes.io/projected/af857683-5749-4e71-8714-0049cd774f67-kube-api-access-t4wcq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6wmvp\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.480525 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6wmvp\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.480655 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6wmvp\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.582424 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6wmvp\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.582592 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4wcq\" (UniqueName: \"kubernetes.io/projected/af857683-5749-4e71-8714-0049cd774f67-kube-api-access-t4wcq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6wmvp\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.582614 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6wmvp\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.587005 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6wmvp\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.595381 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6wmvp\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.602758 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4wcq\" (UniqueName: \"kubernetes.io/projected/af857683-5749-4e71-8714-0049cd774f67-kube-api-access-t4wcq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6wmvp\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:21 crc kubenswrapper[4831]: I1204 10:48:21.692089 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:22 crc kubenswrapper[4831]: I1204 10:48:22.228688 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp"] Dec 04 10:48:22 crc kubenswrapper[4831]: I1204 10:48:22.264357 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" event={"ID":"af857683-5749-4e71-8714-0049cd774f67","Type":"ContainerStarted","Data":"e5a22f6e7fbf9463865d13c318c119ae48df42d26300609462e4cec843bf7700"} Dec 04 10:48:23 crc kubenswrapper[4831]: I1204 10:48:23.292084 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" event={"ID":"af857683-5749-4e71-8714-0049cd774f67","Type":"ContainerStarted","Data":"085ca47924cd11565df90871c01ebe285f4b59a8c9af94f184ca76c5b2dc317f"} Dec 04 10:48:23 crc kubenswrapper[4831]: I1204 10:48:23.321788 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" podStartSLOduration=1.725686231 podStartE2EDuration="2.321766631s" podCreationTimestamp="2025-12-04 10:48:21 +0000 UTC" firstStartedPulling="2025-12-04 10:48:22.232065868 +0000 UTC m=+1999.181241182" lastFinishedPulling="2025-12-04 10:48:22.828146268 +0000 UTC m=+1999.777321582" observedRunningTime="2025-12-04 10:48:23.318143365 +0000 UTC m=+2000.267318679" watchObservedRunningTime="2025-12-04 10:48:23.321766631 +0000 UTC m=+2000.270941945" Dec 04 10:48:26 crc kubenswrapper[4831]: I1204 10:48:26.277978 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:48:27 crc kubenswrapper[4831]: I1204 10:48:27.360127 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" 
event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"80db97831c4a7e8b6da9c9072594827bb567104b86c587975c95004562b3e059"} Dec 04 10:48:31 crc kubenswrapper[4831]: I1204 10:48:31.396080 4831 generic.go:334] "Generic (PLEG): container finished" podID="af857683-5749-4e71-8714-0049cd774f67" containerID="085ca47924cd11565df90871c01ebe285f4b59a8c9af94f184ca76c5b2dc317f" exitCode=0 Dec 04 10:48:31 crc kubenswrapper[4831]: I1204 10:48:31.396164 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" event={"ID":"af857683-5749-4e71-8714-0049cd774f67","Type":"ContainerDied","Data":"085ca47924cd11565df90871c01ebe285f4b59a8c9af94f184ca76c5b2dc317f"} Dec 04 10:48:32 crc kubenswrapper[4831]: I1204 10:48:32.895267 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.040784 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4wcq\" (UniqueName: \"kubernetes.io/projected/af857683-5749-4e71-8714-0049cd774f67-kube-api-access-t4wcq\") pod \"af857683-5749-4e71-8714-0049cd774f67\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.040925 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-ssh-key\") pod \"af857683-5749-4e71-8714-0049cd774f67\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.040969 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-inventory\") pod \"af857683-5749-4e71-8714-0049cd774f67\" (UID: \"af857683-5749-4e71-8714-0049cd774f67\") " Dec 
04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.047470 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af857683-5749-4e71-8714-0049cd774f67-kube-api-access-t4wcq" (OuterVolumeSpecName: "kube-api-access-t4wcq") pod "af857683-5749-4e71-8714-0049cd774f67" (UID: "af857683-5749-4e71-8714-0049cd774f67"). InnerVolumeSpecName "kube-api-access-t4wcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.068355 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "af857683-5749-4e71-8714-0049cd774f67" (UID: "af857683-5749-4e71-8714-0049cd774f67"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.093232 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-inventory" (OuterVolumeSpecName: "inventory") pod "af857683-5749-4e71-8714-0049cd774f67" (UID: "af857683-5749-4e71-8714-0049cd774f67"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.143951 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.143987 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af857683-5749-4e71-8714-0049cd774f67-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.143999 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4wcq\" (UniqueName: \"kubernetes.io/projected/af857683-5749-4e71-8714-0049cd774f67-kube-api-access-t4wcq\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.414060 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" event={"ID":"af857683-5749-4e71-8714-0049cd774f67","Type":"ContainerDied","Data":"e5a22f6e7fbf9463865d13c318c119ae48df42d26300609462e4cec843bf7700"} Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.414104 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5a22f6e7fbf9463865d13c318c119ae48df42d26300609462e4cec843bf7700" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.414106 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6wmvp" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.486898 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg"] Dec 04 10:48:33 crc kubenswrapper[4831]: E1204 10:48:33.487403 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af857683-5749-4e71-8714-0049cd774f67" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.487430 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="af857683-5749-4e71-8714-0049cd774f67" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.487887 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="af857683-5749-4e71-8714-0049cd774f67" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.488697 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.491775 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.491899 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.492275 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.494982 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.504531 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg"] Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.652166 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-592mg\" (UID: \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.652254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j45l\" (UniqueName: \"kubernetes.io/projected/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-kube-api-access-2j45l\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-592mg\" (UID: \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.652323 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-592mg\" (UID: \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.754335 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-592mg\" (UID: \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.754485 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-592mg\" (UID: \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.754511 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j45l\" (UniqueName: \"kubernetes.io/projected/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-kube-api-access-2j45l\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-592mg\" (UID: \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.759851 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-592mg\" (UID: 
\"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.761749 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-592mg\" (UID: \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.770516 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j45l\" (UniqueName: \"kubernetes.io/projected/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-kube-api-access-2j45l\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-592mg\" (UID: \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:33 crc kubenswrapper[4831]: I1204 10:48:33.848323 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:34 crc kubenswrapper[4831]: I1204 10:48:34.369555 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg"] Dec 04 10:48:34 crc kubenswrapper[4831]: I1204 10:48:34.424502 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" event={"ID":"a83de3fa-04fa-43f2-bfd0-48e9e3928c34","Type":"ContainerStarted","Data":"243ef0745746346908f9ca854e4d4453140bb0506c8d35b97ac134bbe853a09b"} Dec 04 10:48:36 crc kubenswrapper[4831]: I1204 10:48:36.458859 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" event={"ID":"a83de3fa-04fa-43f2-bfd0-48e9e3928c34","Type":"ContainerStarted","Data":"a162c6fa226ad27df2c39ffe8bbcb080257bdd3229cf7e762da339e369ad102e"} Dec 04 10:48:36 crc kubenswrapper[4831]: I1204 10:48:36.476567 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" podStartSLOduration=2.187094304 podStartE2EDuration="3.476547334s" podCreationTimestamp="2025-12-04 10:48:33 +0000 UTC" firstStartedPulling="2025-12-04 10:48:34.377723777 +0000 UTC m=+2011.326899091" lastFinishedPulling="2025-12-04 10:48:35.667176807 +0000 UTC m=+2012.616352121" observedRunningTime="2025-12-04 10:48:36.472940357 +0000 UTC m=+2013.422115661" watchObservedRunningTime="2025-12-04 10:48:36.476547334 +0000 UTC m=+2013.425722638" Dec 04 10:48:45 crc kubenswrapper[4831]: I1204 10:48:45.539579 4831 generic.go:334] "Generic (PLEG): container finished" podID="a83de3fa-04fa-43f2-bfd0-48e9e3928c34" containerID="a162c6fa226ad27df2c39ffe8bbcb080257bdd3229cf7e762da339e369ad102e" exitCode=0 Dec 04 10:48:45 crc kubenswrapper[4831]: I1204 10:48:45.539671 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" event={"ID":"a83de3fa-04fa-43f2-bfd0-48e9e3928c34","Type":"ContainerDied","Data":"a162c6fa226ad27df2c39ffe8bbcb080257bdd3229cf7e762da339e369ad102e"} Dec 04 10:48:46 crc kubenswrapper[4831]: I1204 10:48:46.963888 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.021210 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j45l\" (UniqueName: \"kubernetes.io/projected/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-kube-api-access-2j45l\") pod \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\" (UID: \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.021784 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-inventory\") pod \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\" (UID: \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.021812 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-ssh-key\") pod \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\" (UID: \"a83de3fa-04fa-43f2-bfd0-48e9e3928c34\") " Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.031161 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-kube-api-access-2j45l" (OuterVolumeSpecName: "kube-api-access-2j45l") pod "a83de3fa-04fa-43f2-bfd0-48e9e3928c34" (UID: "a83de3fa-04fa-43f2-bfd0-48e9e3928c34"). InnerVolumeSpecName "kube-api-access-2j45l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.060424 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a83de3fa-04fa-43f2-bfd0-48e9e3928c34" (UID: "a83de3fa-04fa-43f2-bfd0-48e9e3928c34"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.062030 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-inventory" (OuterVolumeSpecName: "inventory") pod "a83de3fa-04fa-43f2-bfd0-48e9e3928c34" (UID: "a83de3fa-04fa-43f2-bfd0-48e9e3928c34"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.137957 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.137991 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.138000 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j45l\" (UniqueName: \"kubernetes.io/projected/a83de3fa-04fa-43f2-bfd0-48e9e3928c34-kube-api-access-2j45l\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.564416 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" 
event={"ID":"a83de3fa-04fa-43f2-bfd0-48e9e3928c34","Type":"ContainerDied","Data":"243ef0745746346908f9ca854e4d4453140bb0506c8d35b97ac134bbe853a09b"} Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.564853 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="243ef0745746346908f9ca854e4d4453140bb0506c8d35b97ac134bbe853a09b" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.564453 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-592mg" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.644967 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6"] Dec 04 10:48:47 crc kubenswrapper[4831]: E1204 10:48:47.645639 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83de3fa-04fa-43f2-bfd0-48e9e3928c34" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.645681 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83de3fa-04fa-43f2-bfd0-48e9e3928c34" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.645937 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83de3fa-04fa-43f2-bfd0-48e9e3928c34" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.646771 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.650442 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.650629 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.650986 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.651848 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.652017 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.652166 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.652921 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.661280 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.699871 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6"] Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.750376 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.750457 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.750504 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.750534 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.750586 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.750612 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.750668 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.750711 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.750966 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.751087 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhbgf\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-kube-api-access-hhbgf\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.751161 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.751268 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.751428 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.751620 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.853774 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.853847 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.853881 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.853922 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.853990 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.854023 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhbgf\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-kube-api-access-hhbgf\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.854054 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.854096 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.854147 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.854197 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.854241 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.854292 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.854339 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.854366 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.861410 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.861518 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.861907 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.862180 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.863802 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.864260 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.864738 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.865196 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.865684 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.866995 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.869270 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.873447 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.873447 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.879289 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhbgf\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-kube-api-access-hhbgf\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:47 crc kubenswrapper[4831]: I1204 10:48:47.991038 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:48:48 crc kubenswrapper[4831]: I1204 10:48:48.619213 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6"] Dec 04 10:48:49 crc kubenswrapper[4831]: I1204 10:48:49.587142 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" event={"ID":"2b1e2fb0-a6df-4568-a7d9-cce135438da5","Type":"ContainerStarted","Data":"385ea300a04b9fffbc7ce944e61c5464b73cf0e5a41afd446c6e8f40ff4a5605"} Dec 04 10:48:50 crc kubenswrapper[4831]: I1204 10:48:50.603003 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" event={"ID":"2b1e2fb0-a6df-4568-a7d9-cce135438da5","Type":"ContainerStarted","Data":"f3e2633d7cdeff08a619b6d18e4d7744c1391d897e801afd44c26e943892d511"} Dec 04 10:48:50 crc kubenswrapper[4831]: I1204 10:48:50.630562 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" podStartSLOduration=2.019473796 podStartE2EDuration="3.630545173s" podCreationTimestamp="2025-12-04 10:48:47 +0000 UTC" firstStartedPulling="2025-12-04 10:48:48.628118964 +0000 UTC m=+2025.577294268" lastFinishedPulling="2025-12-04 10:48:50.239190331 +0000 UTC m=+2027.188365645" observedRunningTime="2025-12-04 10:48:50.623959787 +0000 UTC m=+2027.573135141" watchObservedRunningTime="2025-12-04 10:48:50.630545173 +0000 UTC m=+2027.579720487" Dec 04 10:49:27 crc kubenswrapper[4831]: I1204 10:49:27.955747 4831 generic.go:334] "Generic (PLEG): container finished" podID="2b1e2fb0-a6df-4568-a7d9-cce135438da5" 
containerID="f3e2633d7cdeff08a619b6d18e4d7744c1391d897e801afd44c26e943892d511" exitCode=0 Dec 04 10:49:27 crc kubenswrapper[4831]: I1204 10:49:27.955805 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" event={"ID":"2b1e2fb0-a6df-4568-a7d9-cce135438da5","Type":"ContainerDied","Data":"f3e2633d7cdeff08a619b6d18e4d7744c1391d897e801afd44c26e943892d511"} Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.374965 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.553227 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ovn-combined-ca-bundle\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.553348 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.553374 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-nova-combined-ca-bundle\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.554314 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-inventory\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.554344 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.554412 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.554460 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-repo-setup-combined-ca-bundle\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.554483 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-neutron-metadata-combined-ca-bundle\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.554504 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-libvirt-combined-ca-bundle\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.554547 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.554573 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhbgf\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-kube-api-access-hhbgf\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.554610 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-bootstrap-combined-ca-bundle\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.554630 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ssh-key\") pod \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.554669 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-telemetry-combined-ca-bundle\") pod 
\"2b1e2fb0-a6df-4568-a7d9-cce135438da5\" (UID: \"2b1e2fb0-a6df-4568-a7d9-cce135438da5\") " Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.561127 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.561164 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.561993 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.562015 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.562128 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-kube-api-access-hhbgf" (OuterVolumeSpecName: "kube-api-access-hhbgf") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "kube-api-access-hhbgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.562159 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.562876 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.563997 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.564517 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.564845 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.565245 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.565899 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.595905 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.596257 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-inventory" (OuterVolumeSpecName: "inventory") pod "2b1e2fb0-a6df-4568-a7d9-cce135438da5" (UID: "2b1e2fb0-a6df-4568-a7d9-cce135438da5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656728 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656757 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhbgf\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-kube-api-access-hhbgf\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656770 4831 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656798 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656809 4831 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656817 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656825 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656835 4831 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656845 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656853 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656864 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b1e2fb0-a6df-4568-a7d9-cce135438da5-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656875 4831 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.656886 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 
10:49:29.656894 4831 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1e2fb0-a6df-4568-a7d9-cce135438da5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.975254 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" event={"ID":"2b1e2fb0-a6df-4568-a7d9-cce135438da5","Type":"ContainerDied","Data":"385ea300a04b9fffbc7ce944e61c5464b73cf0e5a41afd446c6e8f40ff4a5605"} Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.975642 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="385ea300a04b9fffbc7ce944e61c5464b73cf0e5a41afd446c6e8f40ff4a5605" Dec 04 10:49:29 crc kubenswrapper[4831]: I1204 10:49:29.975302 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.098995 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq"] Dec 04 10:49:30 crc kubenswrapper[4831]: E1204 10:49:30.099757 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1e2fb0-a6df-4568-a7d9-cce135438da5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.099858 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1e2fb0-a6df-4568-a7d9-cce135438da5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.100184 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1e2fb0-a6df-4568-a7d9-cce135438da5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.101157 4831 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.106385 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.106613 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.106798 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.107384 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.107651 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.110782 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq"] Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.167654 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.167816 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.167867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.167898 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7p47\" (UniqueName: \"kubernetes.io/projected/514cea2a-db2a-476e-aace-741121838112-kube-api-access-f7p47\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.168015 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/514cea2a-db2a-476e-aace-741121838112-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.269560 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7p47\" (UniqueName: \"kubernetes.io/projected/514cea2a-db2a-476e-aace-741121838112-kube-api-access-f7p47\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.269616 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/514cea2a-db2a-476e-aace-741121838112-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.269746 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.269799 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.269829 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.270770 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/514cea2a-db2a-476e-aace-741121838112-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" 
Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.274096 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.274293 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.275517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.291957 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7p47\" (UniqueName: \"kubernetes.io/projected/514cea2a-db2a-476e-aace-741121838112-kube-api-access-f7p47\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c58gq\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.425050 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:49:30 crc kubenswrapper[4831]: I1204 10:49:30.993062 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq"] Dec 04 10:49:31 crc kubenswrapper[4831]: I1204 10:49:31.995219 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" event={"ID":"514cea2a-db2a-476e-aace-741121838112","Type":"ContainerStarted","Data":"f036a726fba2d78118565a06a067afa1cfe22dca69646f0c380047f212f31706"} Dec 04 10:49:31 crc kubenswrapper[4831]: I1204 10:49:31.995818 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" event={"ID":"514cea2a-db2a-476e-aace-741121838112","Type":"ContainerStarted","Data":"706d54664659700b34247b047aa134ce9bce04020f808eb60bd4dbd044b7d1fa"} Dec 04 10:49:32 crc kubenswrapper[4831]: I1204 10:49:32.025428 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" podStartSLOduration=1.533300299 podStartE2EDuration="2.025409334s" podCreationTimestamp="2025-12-04 10:49:30 +0000 UTC" firstStartedPulling="2025-12-04 10:49:30.992635865 +0000 UTC m=+2067.941811169" lastFinishedPulling="2025-12-04 10:49:31.48474485 +0000 UTC m=+2068.433920204" observedRunningTime="2025-12-04 10:49:32.018638603 +0000 UTC m=+2068.967813917" watchObservedRunningTime="2025-12-04 10:49:32.025409334 +0000 UTC m=+2068.974584648" Dec 04 10:50:36 crc kubenswrapper[4831]: I1204 10:50:36.620351 4831 generic.go:334] "Generic (PLEG): container finished" podID="514cea2a-db2a-476e-aace-741121838112" containerID="f036a726fba2d78118565a06a067afa1cfe22dca69646f0c380047f212f31706" exitCode=0 Dec 04 10:50:36 crc kubenswrapper[4831]: I1204 10:50:36.620438 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" event={"ID":"514cea2a-db2a-476e-aace-741121838112","Type":"ContainerDied","Data":"f036a726fba2d78118565a06a067afa1cfe22dca69646f0c380047f212f31706"} Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.120165 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.302020 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-inventory\") pod \"514cea2a-db2a-476e-aace-741121838112\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.302087 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ovn-combined-ca-bundle\") pod \"514cea2a-db2a-476e-aace-741121838112\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.302133 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/514cea2a-db2a-476e-aace-741121838112-ovncontroller-config-0\") pod \"514cea2a-db2a-476e-aace-741121838112\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.302366 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7p47\" (UniqueName: \"kubernetes.io/projected/514cea2a-db2a-476e-aace-741121838112-kube-api-access-f7p47\") pod \"514cea2a-db2a-476e-aace-741121838112\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.302501 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ssh-key\") pod \"514cea2a-db2a-476e-aace-741121838112\" (UID: \"514cea2a-db2a-476e-aace-741121838112\") " Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.309820 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514cea2a-db2a-476e-aace-741121838112-kube-api-access-f7p47" (OuterVolumeSpecName: "kube-api-access-f7p47") pod "514cea2a-db2a-476e-aace-741121838112" (UID: "514cea2a-db2a-476e-aace-741121838112"). InnerVolumeSpecName "kube-api-access-f7p47". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.309908 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "514cea2a-db2a-476e-aace-741121838112" (UID: "514cea2a-db2a-476e-aace-741121838112"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.333044 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/514cea2a-db2a-476e-aace-741121838112-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "514cea2a-db2a-476e-aace-741121838112" (UID: "514cea2a-db2a-476e-aace-741121838112"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.333414 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "514cea2a-db2a-476e-aace-741121838112" (UID: "514cea2a-db2a-476e-aace-741121838112"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.339925 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-inventory" (OuterVolumeSpecName: "inventory") pod "514cea2a-db2a-476e-aace-741121838112" (UID: "514cea2a-db2a-476e-aace-741121838112"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.404467 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.404504 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.404517 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514cea2a-db2a-476e-aace-741121838112-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.404530 4831 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/514cea2a-db2a-476e-aace-741121838112-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.404539 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7p47\" (UniqueName: \"kubernetes.io/projected/514cea2a-db2a-476e-aace-741121838112-kube-api-access-f7p47\") on node \"crc\" DevicePath \"\"" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.671413 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" event={"ID":"514cea2a-db2a-476e-aace-741121838112","Type":"ContainerDied","Data":"706d54664659700b34247b047aa134ce9bce04020f808eb60bd4dbd044b7d1fa"} Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.671734 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="706d54664659700b34247b047aa134ce9bce04020f808eb60bd4dbd044b7d1fa" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.671791 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c58gq" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.739576 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k"] Dec 04 10:50:38 crc kubenswrapper[4831]: E1204 10:50:38.740115 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514cea2a-db2a-476e-aace-741121838112" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.740138 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="514cea2a-db2a-476e-aace-741121838112" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.740363 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="514cea2a-db2a-476e-aace-741121838112" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.741803 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.744466 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.744887 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.745074 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.745296 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.745485 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.745689 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.749173 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k"] Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.914818 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.914879 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.914921 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.914991 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.915147 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6rk\" (UniqueName: \"kubernetes.io/projected/e2b14e5b-5d51-4f51-b199-7dc570d507b6-kube-api-access-5l6rk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:38 crc kubenswrapper[4831]: I1204 10:50:38.915200 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.017297 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6rk\" (UniqueName: \"kubernetes.io/projected/e2b14e5b-5d51-4f51-b199-7dc570d507b6-kube-api-access-5l6rk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.017357 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.017384 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.017415 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.017460 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.017484 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.022715 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.023039 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: 
\"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.023910 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.024057 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.035402 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.036367 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6rk\" (UniqueName: \"kubernetes.io/projected/e2b14e5b-5d51-4f51-b199-7dc570d507b6-kube-api-access-5l6rk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc 
kubenswrapper[4831]: I1204 10:50:39.065766 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.588878 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k"] Dec 04 10:50:39 crc kubenswrapper[4831]: I1204 10:50:39.681033 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" event={"ID":"e2b14e5b-5d51-4f51-b199-7dc570d507b6","Type":"ContainerStarted","Data":"82234b19bb79480f5e07666fe431e3912591d4f1b4dc6bc5cf5c23509c1bdc22"} Dec 04 10:50:40 crc kubenswrapper[4831]: I1204 10:50:40.706169 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" event={"ID":"e2b14e5b-5d51-4f51-b199-7dc570d507b6","Type":"ContainerStarted","Data":"79bb67d211d03613e5163e0505a57924004ff6898e1422fffc4fa65fe372784b"} Dec 04 10:50:40 crc kubenswrapper[4831]: I1204 10:50:40.745240 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" podStartSLOduration=2.192670504 podStartE2EDuration="2.745213735s" podCreationTimestamp="2025-12-04 10:50:38 +0000 UTC" firstStartedPulling="2025-12-04 10:50:39.599846576 +0000 UTC m=+2136.549021890" lastFinishedPulling="2025-12-04 10:50:40.152389807 +0000 UTC m=+2137.101565121" observedRunningTime="2025-12-04 10:50:40.730412539 +0000 UTC m=+2137.679587873" watchObservedRunningTime="2025-12-04 10:50:40.745213735 +0000 UTC m=+2137.694389079" Dec 04 10:50:51 crc kubenswrapper[4831]: I1204 10:50:51.971671 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:50:51 crc kubenswrapper[4831]: I1204 10:50:51.972119 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:51:21 crc kubenswrapper[4831]: I1204 10:51:21.972057 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:51:21 crc kubenswrapper[4831]: I1204 10:51:21.972605 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:51:29 crc kubenswrapper[4831]: E1204 10:51:29.939539 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2b14e5b_5d51_4f51_b199_7dc570d507b6.slice/crio-conmon-79bb67d211d03613e5163e0505a57924004ff6898e1422fffc4fa65fe372784b.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:51:30 crc kubenswrapper[4831]: I1204 10:51:30.171286 4831 generic.go:334] "Generic (PLEG): container finished" podID="e2b14e5b-5d51-4f51-b199-7dc570d507b6" containerID="79bb67d211d03613e5163e0505a57924004ff6898e1422fffc4fa65fe372784b" exitCode=0 Dec 04 10:51:30 crc kubenswrapper[4831]: I1204 10:51:30.171534 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" event={"ID":"e2b14e5b-5d51-4f51-b199-7dc570d507b6","Type":"ContainerDied","Data":"79bb67d211d03613e5163e0505a57924004ff6898e1422fffc4fa65fe372784b"} Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.613527 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.717748 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-metadata-combined-ca-bundle\") pod \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.717822 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.717911 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-ssh-key\") pod \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.717939 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-nova-metadata-neutron-config-0\") pod \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\" (UID: 
\"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.717965 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-inventory\") pod \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.718056 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l6rk\" (UniqueName: \"kubernetes.io/projected/e2b14e5b-5d51-4f51-b199-7dc570d507b6-kube-api-access-5l6rk\") pod \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\" (UID: \"e2b14e5b-5d51-4f51-b199-7dc570d507b6\") " Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.734623 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b14e5b-5d51-4f51-b199-7dc570d507b6-kube-api-access-5l6rk" (OuterVolumeSpecName: "kube-api-access-5l6rk") pod "e2b14e5b-5d51-4f51-b199-7dc570d507b6" (UID: "e2b14e5b-5d51-4f51-b199-7dc570d507b6"). InnerVolumeSpecName "kube-api-access-5l6rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.738837 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e2b14e5b-5d51-4f51-b199-7dc570d507b6" (UID: "e2b14e5b-5d51-4f51-b199-7dc570d507b6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.750204 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e2b14e5b-5d51-4f51-b199-7dc570d507b6" (UID: "e2b14e5b-5d51-4f51-b199-7dc570d507b6"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.755679 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e2b14e5b-5d51-4f51-b199-7dc570d507b6" (UID: "e2b14e5b-5d51-4f51-b199-7dc570d507b6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.759973 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-inventory" (OuterVolumeSpecName: "inventory") pod "e2b14e5b-5d51-4f51-b199-7dc570d507b6" (UID: "e2b14e5b-5d51-4f51-b199-7dc570d507b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.771046 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e2b14e5b-5d51-4f51-b199-7dc570d507b6" (UID: "e2b14e5b-5d51-4f51-b199-7dc570d507b6"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.820197 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.820238 4831 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.820254 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.820267 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l6rk\" (UniqueName: \"kubernetes.io/projected/e2b14e5b-5d51-4f51-b199-7dc570d507b6-kube-api-access-5l6rk\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.820279 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:31 crc kubenswrapper[4831]: I1204 10:51:31.820294 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b14e5b-5d51-4f51-b199-7dc570d507b6-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.190547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" 
event={"ID":"e2b14e5b-5d51-4f51-b199-7dc570d507b6","Type":"ContainerDied","Data":"82234b19bb79480f5e07666fe431e3912591d4f1b4dc6bc5cf5c23509c1bdc22"} Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.190590 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82234b19bb79480f5e07666fe431e3912591d4f1b4dc6bc5cf5c23509c1bdc22" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.191050 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.286217 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529"] Dec 04 10:51:32 crc kubenswrapper[4831]: E1204 10:51:32.286795 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b14e5b-5d51-4f51-b199-7dc570d507b6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.286819 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b14e5b-5d51-4f51-b199-7dc570d507b6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.287259 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b14e5b-5d51-4f51-b199-7dc570d507b6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.288089 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.290804 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.291187 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.291324 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.292728 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.294727 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.302348 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529"] Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.435847 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.436192 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.436338 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.436536 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f422\" (UniqueName: \"kubernetes.io/projected/edce9302-713f-454f-b725-e30e8f594cd3-kube-api-access-8f422\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.436705 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.538472 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.538522 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.538570 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.538689 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f422\" (UniqueName: \"kubernetes.io/projected/edce9302-713f-454f-b725-e30e8f594cd3-kube-api-access-8f422\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.538736 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.542551 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.542576 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.542709 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.544158 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.564041 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f422\" (UniqueName: \"kubernetes.io/projected/edce9302-713f-454f-b725-e30e8f594cd3-kube-api-access-8f422\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rh529\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:32 crc kubenswrapper[4831]: I1204 10:51:32.618544 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:51:33 crc kubenswrapper[4831]: I1204 10:51:33.174529 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529"] Dec 04 10:51:33 crc kubenswrapper[4831]: I1204 10:51:33.179361 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:51:33 crc kubenswrapper[4831]: I1204 10:51:33.200994 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" event={"ID":"edce9302-713f-454f-b725-e30e8f594cd3","Type":"ContainerStarted","Data":"39eaa24ef73996a64a1ca9cb54b44be1153a0dd863107646f262e8513cfab806"} Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.030985 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l2dbx"] Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.034905 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.043030 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2dbx"] Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.173030 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-catalog-content\") pod \"redhat-operators-l2dbx\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.173133 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-utilities\") pod \"redhat-operators-l2dbx\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.173181 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqx5p\" (UniqueName: \"kubernetes.io/projected/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-kube-api-access-fqx5p\") pod \"redhat-operators-l2dbx\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.217849 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7m2f4"] Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.224946 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.253043 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7m2f4"] Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.276093 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-catalog-content\") pod \"redhat-operators-l2dbx\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.276187 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-utilities\") pod \"redhat-operators-l2dbx\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.276224 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqx5p\" (UniqueName: \"kubernetes.io/projected/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-kube-api-access-fqx5p\") pod \"redhat-operators-l2dbx\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.276842 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-utilities\") pod \"redhat-operators-l2dbx\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.277204 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-catalog-content\") pod \"redhat-operators-l2dbx\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.303475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqx5p\" (UniqueName: \"kubernetes.io/projected/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-kube-api-access-fqx5p\") pod \"redhat-operators-l2dbx\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.377893 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-catalog-content\") pod \"certified-operators-7m2f4\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.377998 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v2ld\" (UniqueName: \"kubernetes.io/projected/128686bd-ffd4-4b11-b829-98a65e2ddf01-kube-api-access-9v2ld\") pod \"certified-operators-7m2f4\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.378168 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-utilities\") pod \"certified-operators-7m2f4\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.444868 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.480275 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-utilities\") pod \"certified-operators-7m2f4\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.480367 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-catalog-content\") pod \"certified-operators-7m2f4\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.480425 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2ld\" (UniqueName: \"kubernetes.io/projected/128686bd-ffd4-4b11-b829-98a65e2ddf01-kube-api-access-9v2ld\") pod \"certified-operators-7m2f4\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.481238 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-utilities\") pod \"certified-operators-7m2f4\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.481504 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-catalog-content\") pod \"certified-operators-7m2f4\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " 
pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.501850 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v2ld\" (UniqueName: \"kubernetes.io/projected/128686bd-ffd4-4b11-b829-98a65e2ddf01-kube-api-access-9v2ld\") pod \"certified-operators-7m2f4\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.543709 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:34 crc kubenswrapper[4831]: I1204 10:51:34.998548 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2dbx"] Dec 04 10:51:35 crc kubenswrapper[4831]: W1204 10:51:35.001409 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78bc4fa0_3286_4b3e_a796_34d6c1784cdc.slice/crio-642b9dfa94138e79f0f41f6e1175c539da761ac49c50ecdbb872b5dca8fe092c WatchSource:0}: Error finding container 642b9dfa94138e79f0f41f6e1175c539da761ac49c50ecdbb872b5dca8fe092c: Status 404 returned error can't find the container with id 642b9dfa94138e79f0f41f6e1175c539da761ac49c50ecdbb872b5dca8fe092c Dec 04 10:51:35 crc kubenswrapper[4831]: I1204 10:51:35.114262 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7m2f4"] Dec 04 10:51:35 crc kubenswrapper[4831]: W1204 10:51:35.125856 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod128686bd_ffd4_4b11_b829_98a65e2ddf01.slice/crio-6dfaed26dbca25b3f4f6720c1b42fb69a7271266bdfb344c614886578c7fa56f WatchSource:0}: Error finding container 6dfaed26dbca25b3f4f6720c1b42fb69a7271266bdfb344c614886578c7fa56f: Status 404 returned error can't find the container 
with id 6dfaed26dbca25b3f4f6720c1b42fb69a7271266bdfb344c614886578c7fa56f Dec 04 10:51:35 crc kubenswrapper[4831]: I1204 10:51:35.219173 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m2f4" event={"ID":"128686bd-ffd4-4b11-b829-98a65e2ddf01","Type":"ContainerStarted","Data":"6dfaed26dbca25b3f4f6720c1b42fb69a7271266bdfb344c614886578c7fa56f"} Dec 04 10:51:35 crc kubenswrapper[4831]: I1204 10:51:35.221950 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2dbx" event={"ID":"78bc4fa0-3286-4b3e-a796-34d6c1784cdc","Type":"ContainerStarted","Data":"642b9dfa94138e79f0f41f6e1175c539da761ac49c50ecdbb872b5dca8fe092c"} Dec 04 10:51:35 crc kubenswrapper[4831]: I1204 10:51:35.223268 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" event={"ID":"edce9302-713f-454f-b725-e30e8f594cd3","Type":"ContainerStarted","Data":"e29fb17e04897092881580916ddef070c474732da6f1111303d1bde3ee20210b"} Dec 04 10:51:35 crc kubenswrapper[4831]: I1204 10:51:35.245209 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" podStartSLOduration=2.483459363 podStartE2EDuration="3.245181596s" podCreationTimestamp="2025-12-04 10:51:32 +0000 UTC" firstStartedPulling="2025-12-04 10:51:33.179037322 +0000 UTC m=+2190.128212626" lastFinishedPulling="2025-12-04 10:51:33.940759545 +0000 UTC m=+2190.889934859" observedRunningTime="2025-12-04 10:51:35.23599917 +0000 UTC m=+2192.185174484" watchObservedRunningTime="2025-12-04 10:51:35.245181596 +0000 UTC m=+2192.194356910" Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.234152 4831 generic.go:334] "Generic (PLEG): container finished" podID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" containerID="89441f5ca691d82e55bdfa38e6223a583808e7dc6510fcae85d15d4971bfdf69" exitCode=0 Dec 04 10:51:36 crc kubenswrapper[4831]: 
I1204 10:51:36.234272 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2dbx" event={"ID":"78bc4fa0-3286-4b3e-a796-34d6c1784cdc","Type":"ContainerDied","Data":"89441f5ca691d82e55bdfa38e6223a583808e7dc6510fcae85d15d4971bfdf69"} Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.239101 4831 generic.go:334] "Generic (PLEG): container finished" podID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerID="6d2e9719bae3ef35f2d793d19b99989f3a68bbe3d2dbc6ab48e1cd35783b1c87" exitCode=0 Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.240717 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m2f4" event={"ID":"128686bd-ffd4-4b11-b829-98a65e2ddf01","Type":"ContainerDied","Data":"6d2e9719bae3ef35f2d793d19b99989f3a68bbe3d2dbc6ab48e1cd35783b1c87"} Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.416557 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fhczj"] Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.418753 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.437605 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhczj"] Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.531715 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-catalog-content\") pod \"community-operators-fhczj\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.532074 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-utilities\") pod \"community-operators-fhczj\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.532308 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwgpm\" (UniqueName: \"kubernetes.io/projected/8d0a4013-5ccc-42e0-8361-324fa61440c0-kube-api-access-zwgpm\") pod \"community-operators-fhczj\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.634604 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-catalog-content\") pod \"community-operators-fhczj\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.634743 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-utilities\") pod \"community-operators-fhczj\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.634913 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwgpm\" (UniqueName: \"kubernetes.io/projected/8d0a4013-5ccc-42e0-8361-324fa61440c0-kube-api-access-zwgpm\") pod \"community-operators-fhczj\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.635267 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-catalog-content\") pod \"community-operators-fhczj\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.635331 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-utilities\") pod \"community-operators-fhczj\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.657576 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwgpm\" (UniqueName: \"kubernetes.io/projected/8d0a4013-5ccc-42e0-8361-324fa61440c0-kube-api-access-zwgpm\") pod \"community-operators-fhczj\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:36 crc kubenswrapper[4831]: I1204 10:51:36.741145 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:37 crc kubenswrapper[4831]: I1204 10:51:37.342186 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhczj"] Dec 04 10:51:38 crc kubenswrapper[4831]: I1204 10:51:38.268366 4831 generic.go:334] "Generic (PLEG): container finished" podID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerID="fe8559a5b694fab98c4f1c6df41b9eb03dcf6f66b65d9289ff285a7e00b16e10" exitCode=0 Dec 04 10:51:38 crc kubenswrapper[4831]: I1204 10:51:38.268851 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m2f4" event={"ID":"128686bd-ffd4-4b11-b829-98a65e2ddf01","Type":"ContainerDied","Data":"fe8559a5b694fab98c4f1c6df41b9eb03dcf6f66b65d9289ff285a7e00b16e10"} Dec 04 10:51:38 crc kubenswrapper[4831]: I1204 10:51:38.271714 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2dbx" event={"ID":"78bc4fa0-3286-4b3e-a796-34d6c1784cdc","Type":"ContainerStarted","Data":"36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c"} Dec 04 10:51:38 crc kubenswrapper[4831]: I1204 10:51:38.275711 4831 generic.go:334] "Generic (PLEG): container finished" podID="8d0a4013-5ccc-42e0-8361-324fa61440c0" containerID="2bc97772803d611d3a877da6d640cfccbfe5b6527650ac531ffb57d7bf52cda9" exitCode=0 Dec 04 10:51:38 crc kubenswrapper[4831]: I1204 10:51:38.275801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhczj" event={"ID":"8d0a4013-5ccc-42e0-8361-324fa61440c0","Type":"ContainerDied","Data":"2bc97772803d611d3a877da6d640cfccbfe5b6527650ac531ffb57d7bf52cda9"} Dec 04 10:51:38 crc kubenswrapper[4831]: I1204 10:51:38.276115 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhczj" 
event={"ID":"8d0a4013-5ccc-42e0-8361-324fa61440c0","Type":"ContainerStarted","Data":"c90a9448b6c1b08a270f052ec0e0cfa46ff3854eb58144c8f7bb90250dbdb63a"} Dec 04 10:51:39 crc kubenswrapper[4831]: I1204 10:51:39.285771 4831 generic.go:334] "Generic (PLEG): container finished" podID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" containerID="36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c" exitCode=0 Dec 04 10:51:39 crc kubenswrapper[4831]: I1204 10:51:39.289101 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2dbx" event={"ID":"78bc4fa0-3286-4b3e-a796-34d6c1784cdc","Type":"ContainerDied","Data":"36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c"} Dec 04 10:51:40 crc kubenswrapper[4831]: I1204 10:51:40.298679 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2dbx" event={"ID":"78bc4fa0-3286-4b3e-a796-34d6c1784cdc","Type":"ContainerStarted","Data":"dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e"} Dec 04 10:51:40 crc kubenswrapper[4831]: I1204 10:51:40.303314 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m2f4" event={"ID":"128686bd-ffd4-4b11-b829-98a65e2ddf01","Type":"ContainerStarted","Data":"4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25"} Dec 04 10:51:40 crc kubenswrapper[4831]: I1204 10:51:40.322019 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l2dbx" podStartSLOduration=3.4019516899999998 podStartE2EDuration="6.32199555s" podCreationTimestamp="2025-12-04 10:51:34 +0000 UTC" firstStartedPulling="2025-12-04 10:51:36.23780947 +0000 UTC m=+2193.186984824" lastFinishedPulling="2025-12-04 10:51:39.15785337 +0000 UTC m=+2196.107028684" observedRunningTime="2025-12-04 10:51:40.314027967 +0000 UTC m=+2197.263203301" watchObservedRunningTime="2025-12-04 10:51:40.32199555 +0000 UTC 
m=+2197.271170864" Dec 04 10:51:40 crc kubenswrapper[4831]: I1204 10:51:40.342281 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7m2f4" podStartSLOduration=3.210131473 podStartE2EDuration="6.342258992s" podCreationTimestamp="2025-12-04 10:51:34 +0000 UTC" firstStartedPulling="2025-12-04 10:51:36.242916217 +0000 UTC m=+2193.192091531" lastFinishedPulling="2025-12-04 10:51:39.375043736 +0000 UTC m=+2196.324219050" observedRunningTime="2025-12-04 10:51:40.335140341 +0000 UTC m=+2197.284315695" watchObservedRunningTime="2025-12-04 10:51:40.342258992 +0000 UTC m=+2197.291434306" Dec 04 10:51:41 crc kubenswrapper[4831]: I1204 10:51:41.313950 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhczj" event={"ID":"8d0a4013-5ccc-42e0-8361-324fa61440c0","Type":"ContainerStarted","Data":"7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff"} Dec 04 10:51:43 crc kubenswrapper[4831]: I1204 10:51:43.334085 4831 generic.go:334] "Generic (PLEG): container finished" podID="8d0a4013-5ccc-42e0-8361-324fa61440c0" containerID="7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff" exitCode=0 Dec 04 10:51:43 crc kubenswrapper[4831]: I1204 10:51:43.334158 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhczj" event={"ID":"8d0a4013-5ccc-42e0-8361-324fa61440c0","Type":"ContainerDied","Data":"7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff"} Dec 04 10:51:44 crc kubenswrapper[4831]: I1204 10:51:44.445945 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:44 crc kubenswrapper[4831]: I1204 10:51:44.446843 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:44 crc kubenswrapper[4831]: I1204 10:51:44.544377 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:44 crc kubenswrapper[4831]: I1204 10:51:44.544441 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:45 crc kubenswrapper[4831]: I1204 10:51:45.498923 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l2dbx" podUID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" containerName="registry-server" probeResult="failure" output=< Dec 04 10:51:45 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 04 10:51:45 crc kubenswrapper[4831]: > Dec 04 10:51:45 crc kubenswrapper[4831]: I1204 10:51:45.587032 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7m2f4" podUID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerName="registry-server" probeResult="failure" output=< Dec 04 10:51:45 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 04 10:51:45 crc kubenswrapper[4831]: > Dec 04 10:51:47 crc kubenswrapper[4831]: I1204 10:51:47.372833 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhczj" event={"ID":"8d0a4013-5ccc-42e0-8361-324fa61440c0","Type":"ContainerStarted","Data":"7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7"} Dec 04 10:51:47 crc kubenswrapper[4831]: I1204 10:51:47.394934 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fhczj" podStartSLOduration=4.108374649 podStartE2EDuration="11.394909495s" podCreationTimestamp="2025-12-04 10:51:36 +0000 UTC" firstStartedPulling="2025-12-04 10:51:39.373996588 +0000 UTC m=+2196.323171922" lastFinishedPulling="2025-12-04 10:51:46.660531454 +0000 UTC m=+2203.609706768" observedRunningTime="2025-12-04 10:51:47.3902413 +0000 UTC 
m=+2204.339416624" watchObservedRunningTime="2025-12-04 10:51:47.394909495 +0000 UTC m=+2204.344084819" Dec 04 10:51:51 crc kubenswrapper[4831]: I1204 10:51:51.971799 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:51:51 crc kubenswrapper[4831]: I1204 10:51:51.972475 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:51:51 crc kubenswrapper[4831]: I1204 10:51:51.972544 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:51:51 crc kubenswrapper[4831]: I1204 10:51:51.973619 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80db97831c4a7e8b6da9c9072594827bb567104b86c587975c95004562b3e059"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:51:51 crc kubenswrapper[4831]: I1204 10:51:51.973755 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://80db97831c4a7e8b6da9c9072594827bb567104b86c587975c95004562b3e059" gracePeriod=600 Dec 04 10:51:52 crc kubenswrapper[4831]: I1204 10:51:52.429888 4831 generic.go:334] "Generic (PLEG): container 
finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="80db97831c4a7e8b6da9c9072594827bb567104b86c587975c95004562b3e059" exitCode=0 Dec 04 10:51:52 crc kubenswrapper[4831]: I1204 10:51:52.430084 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"80db97831c4a7e8b6da9c9072594827bb567104b86c587975c95004562b3e059"} Dec 04 10:51:52 crc kubenswrapper[4831]: I1204 10:51:52.430516 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9"} Dec 04 10:51:52 crc kubenswrapper[4831]: I1204 10:51:52.430602 4831 scope.go:117] "RemoveContainer" containerID="37e17da4eccc398a7fd4a49aa969c9f666aa6c38055bd6e9bb03c0ad0e553e58" Dec 04 10:51:54 crc kubenswrapper[4831]: I1204 10:51:54.496436 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:54 crc kubenswrapper[4831]: I1204 10:51:54.554498 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:54 crc kubenswrapper[4831]: I1204 10:51:54.597126 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:54 crc kubenswrapper[4831]: I1204 10:51:54.643978 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:54 crc kubenswrapper[4831]: I1204 10:51:54.734205 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2dbx"] Dec 04 10:51:56 crc kubenswrapper[4831]: I1204 10:51:56.471146 4831 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l2dbx" podUID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" containerName="registry-server" containerID="cri-o://dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e" gracePeriod=2 Dec 04 10:51:56 crc kubenswrapper[4831]: I1204 10:51:56.741582 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:56 crc kubenswrapper[4831]: I1204 10:51:56.741674 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:56 crc kubenswrapper[4831]: I1204 10:51:56.794637 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:56 crc kubenswrapper[4831]: I1204 10:51:56.935319 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7m2f4"] Dec 04 10:51:56 crc kubenswrapper[4831]: I1204 10:51:56.935895 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7m2f4" podUID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerName="registry-server" containerID="cri-o://4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25" gracePeriod=2 Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.126697 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.158365 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-catalog-content\") pod \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.158491 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-utilities\") pod \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.158663 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqx5p\" (UniqueName: \"kubernetes.io/projected/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-kube-api-access-fqx5p\") pod \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\" (UID: \"78bc4fa0-3286-4b3e-a796-34d6c1784cdc\") " Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.161455 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-utilities" (OuterVolumeSpecName: "utilities") pod "78bc4fa0-3286-4b3e-a796-34d6c1784cdc" (UID: "78bc4fa0-3286-4b3e-a796-34d6c1784cdc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.175449 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-kube-api-access-fqx5p" (OuterVolumeSpecName: "kube-api-access-fqx5p") pod "78bc4fa0-3286-4b3e-a796-34d6c1784cdc" (UID: "78bc4fa0-3286-4b3e-a796-34d6c1784cdc"). InnerVolumeSpecName "kube-api-access-fqx5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.261762 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqx5p\" (UniqueName: \"kubernetes.io/projected/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-kube-api-access-fqx5p\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.261808 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.283209 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78bc4fa0-3286-4b3e-a796-34d6c1784cdc" (UID: "78bc4fa0-3286-4b3e-a796-34d6c1784cdc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.310158 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.363129 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-catalog-content\") pod \"128686bd-ffd4-4b11-b829-98a65e2ddf01\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.363491 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v2ld\" (UniqueName: \"kubernetes.io/projected/128686bd-ffd4-4b11-b829-98a65e2ddf01-kube-api-access-9v2ld\") pod \"128686bd-ffd4-4b11-b829-98a65e2ddf01\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.363661 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-utilities\") pod \"128686bd-ffd4-4b11-b829-98a65e2ddf01\" (UID: \"128686bd-ffd4-4b11-b829-98a65e2ddf01\") " Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.364595 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78bc4fa0-3286-4b3e-a796-34d6c1784cdc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.367434 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-utilities" (OuterVolumeSpecName: "utilities") pod "128686bd-ffd4-4b11-b829-98a65e2ddf01" (UID: "128686bd-ffd4-4b11-b829-98a65e2ddf01"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.368963 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128686bd-ffd4-4b11-b829-98a65e2ddf01-kube-api-access-9v2ld" (OuterVolumeSpecName: "kube-api-access-9v2ld") pod "128686bd-ffd4-4b11-b829-98a65e2ddf01" (UID: "128686bd-ffd4-4b11-b829-98a65e2ddf01"). InnerVolumeSpecName "kube-api-access-9v2ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.411282 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "128686bd-ffd4-4b11-b829-98a65e2ddf01" (UID: "128686bd-ffd4-4b11-b829-98a65e2ddf01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.466698 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v2ld\" (UniqueName: \"kubernetes.io/projected/128686bd-ffd4-4b11-b829-98a65e2ddf01-kube-api-access-9v2ld\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.466747 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.466762 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128686bd-ffd4-4b11-b829-98a65e2ddf01-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.486110 4831 generic.go:334] "Generic (PLEG): container finished" podID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" 
containerID="dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e" exitCode=0 Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.486188 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l2dbx" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.486194 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2dbx" event={"ID":"78bc4fa0-3286-4b3e-a796-34d6c1784cdc","Type":"ContainerDied","Data":"dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e"} Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.488255 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2dbx" event={"ID":"78bc4fa0-3286-4b3e-a796-34d6c1784cdc","Type":"ContainerDied","Data":"642b9dfa94138e79f0f41f6e1175c539da761ac49c50ecdbb872b5dca8fe092c"} Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.488289 4831 scope.go:117] "RemoveContainer" containerID="dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.496129 4831 generic.go:334] "Generic (PLEG): container finished" podID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerID="4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25" exitCode=0 Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.496251 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7m2f4" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.496510 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m2f4" event={"ID":"128686bd-ffd4-4b11-b829-98a65e2ddf01","Type":"ContainerDied","Data":"4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25"} Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.496596 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m2f4" event={"ID":"128686bd-ffd4-4b11-b829-98a65e2ddf01","Type":"ContainerDied","Data":"6dfaed26dbca25b3f4f6720c1b42fb69a7271266bdfb344c614886578c7fa56f"} Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.524746 4831 scope.go:117] "RemoveContainer" containerID="36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.533112 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2dbx"] Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.543125 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l2dbx"] Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.552399 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7m2f4"] Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.556021 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.563055 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7m2f4"] Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.564084 4831 scope.go:117] "RemoveContainer" containerID="89441f5ca691d82e55bdfa38e6223a583808e7dc6510fcae85d15d4971bfdf69" Dec 04 10:51:57 crc kubenswrapper[4831]: 
I1204 10:51:57.590150 4831 scope.go:117] "RemoveContainer" containerID="dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e" Dec 04 10:51:57 crc kubenswrapper[4831]: E1204 10:51:57.590623 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e\": container with ID starting with dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e not found: ID does not exist" containerID="dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.590655 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e"} err="failed to get container status \"dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e\": rpc error: code = NotFound desc = could not find container \"dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e\": container with ID starting with dcace4771ae283c47fd91d54f1f691d6160e7298fa9f6329c25207640fee902e not found: ID does not exist" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.590708 4831 scope.go:117] "RemoveContainer" containerID="36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c" Dec 04 10:51:57 crc kubenswrapper[4831]: E1204 10:51:57.590947 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c\": container with ID starting with 36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c not found: ID does not exist" containerID="36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.590967 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c"} err="failed to get container status \"36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c\": rpc error: code = NotFound desc = could not find container \"36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c\": container with ID starting with 36149f81bc59e780cb8e76ba58030bc1d63dae4725451a04b1cf091246541b2c not found: ID does not exist" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.590978 4831 scope.go:117] "RemoveContainer" containerID="89441f5ca691d82e55bdfa38e6223a583808e7dc6510fcae85d15d4971bfdf69" Dec 04 10:51:57 crc kubenswrapper[4831]: E1204 10:51:57.591162 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89441f5ca691d82e55bdfa38e6223a583808e7dc6510fcae85d15d4971bfdf69\": container with ID starting with 89441f5ca691d82e55bdfa38e6223a583808e7dc6510fcae85d15d4971bfdf69 not found: ID does not exist" containerID="89441f5ca691d82e55bdfa38e6223a583808e7dc6510fcae85d15d4971bfdf69" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.591183 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89441f5ca691d82e55bdfa38e6223a583808e7dc6510fcae85d15d4971bfdf69"} err="failed to get container status \"89441f5ca691d82e55bdfa38e6223a583808e7dc6510fcae85d15d4971bfdf69\": rpc error: code = NotFound desc = could not find container \"89441f5ca691d82e55bdfa38e6223a583808e7dc6510fcae85d15d4971bfdf69\": container with ID starting with 89441f5ca691d82e55bdfa38e6223a583808e7dc6510fcae85d15d4971bfdf69 not found: ID does not exist" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.591195 4831 scope.go:117] "RemoveContainer" containerID="4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.651517 4831 scope.go:117] "RemoveContainer" 
containerID="fe8559a5b694fab98c4f1c6df41b9eb03dcf6f66b65d9289ff285a7e00b16e10" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.678443 4831 scope.go:117] "RemoveContainer" containerID="6d2e9719bae3ef35f2d793d19b99989f3a68bbe3d2dbc6ab48e1cd35783b1c87" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.721962 4831 scope.go:117] "RemoveContainer" containerID="4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25" Dec 04 10:51:57 crc kubenswrapper[4831]: E1204 10:51:57.722424 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25\": container with ID starting with 4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25 not found: ID does not exist" containerID="4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.722461 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25"} err="failed to get container status \"4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25\": rpc error: code = NotFound desc = could not find container \"4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25\": container with ID starting with 4b6a4729f304c5ce9e98ff307a1c58cc23195fca575c226e09e5a93517974c25 not found: ID does not exist" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.722490 4831 scope.go:117] "RemoveContainer" containerID="fe8559a5b694fab98c4f1c6df41b9eb03dcf6f66b65d9289ff285a7e00b16e10" Dec 04 10:51:57 crc kubenswrapper[4831]: E1204 10:51:57.722961 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8559a5b694fab98c4f1c6df41b9eb03dcf6f66b65d9289ff285a7e00b16e10\": container with ID starting with 
fe8559a5b694fab98c4f1c6df41b9eb03dcf6f66b65d9289ff285a7e00b16e10 not found: ID does not exist" containerID="fe8559a5b694fab98c4f1c6df41b9eb03dcf6f66b65d9289ff285a7e00b16e10" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.722988 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe8559a5b694fab98c4f1c6df41b9eb03dcf6f66b65d9289ff285a7e00b16e10"} err="failed to get container status \"fe8559a5b694fab98c4f1c6df41b9eb03dcf6f66b65d9289ff285a7e00b16e10\": rpc error: code = NotFound desc = could not find container \"fe8559a5b694fab98c4f1c6df41b9eb03dcf6f66b65d9289ff285a7e00b16e10\": container with ID starting with fe8559a5b694fab98c4f1c6df41b9eb03dcf6f66b65d9289ff285a7e00b16e10 not found: ID does not exist" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.723005 4831 scope.go:117] "RemoveContainer" containerID="6d2e9719bae3ef35f2d793d19b99989f3a68bbe3d2dbc6ab48e1cd35783b1c87" Dec 04 10:51:57 crc kubenswrapper[4831]: E1204 10:51:57.723327 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d2e9719bae3ef35f2d793d19b99989f3a68bbe3d2dbc6ab48e1cd35783b1c87\": container with ID starting with 6d2e9719bae3ef35f2d793d19b99989f3a68bbe3d2dbc6ab48e1cd35783b1c87 not found: ID does not exist" containerID="6d2e9719bae3ef35f2d793d19b99989f3a68bbe3d2dbc6ab48e1cd35783b1c87" Dec 04 10:51:57 crc kubenswrapper[4831]: I1204 10:51:57.723353 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d2e9719bae3ef35f2d793d19b99989f3a68bbe3d2dbc6ab48e1cd35783b1c87"} err="failed to get container status \"6d2e9719bae3ef35f2d793d19b99989f3a68bbe3d2dbc6ab48e1cd35783b1c87\": rpc error: code = NotFound desc = could not find container \"6d2e9719bae3ef35f2d793d19b99989f3a68bbe3d2dbc6ab48e1cd35783b1c87\": container with ID starting with 6d2e9719bae3ef35f2d793d19b99989f3a68bbe3d2dbc6ab48e1cd35783b1c87 not found: ID does not 
exist" Dec 04 10:51:59 crc kubenswrapper[4831]: I1204 10:51:59.297186 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128686bd-ffd4-4b11-b829-98a65e2ddf01" path="/var/lib/kubelet/pods/128686bd-ffd4-4b11-b829-98a65e2ddf01/volumes" Dec 04 10:51:59 crc kubenswrapper[4831]: I1204 10:51:59.298754 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" path="/var/lib/kubelet/pods/78bc4fa0-3286-4b3e-a796-34d6c1784cdc/volumes" Dec 04 10:51:59 crc kubenswrapper[4831]: I1204 10:51:59.336171 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhczj"] Dec 04 10:52:00 crc kubenswrapper[4831]: I1204 10:52:00.544196 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fhczj" podUID="8d0a4013-5ccc-42e0-8361-324fa61440c0" containerName="registry-server" containerID="cri-o://7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7" gracePeriod=2 Dec 04 10:52:00 crc kubenswrapper[4831]: E1204 10:52:00.785935 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d0a4013_5ccc_42e0_8361_324fa61440c0.slice/crio-conmon-7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d0a4013_5ccc_42e0_8361_324fa61440c0.slice/crio-7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.009584 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.145601 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-utilities\") pod \"8d0a4013-5ccc-42e0-8361-324fa61440c0\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.145738 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwgpm\" (UniqueName: \"kubernetes.io/projected/8d0a4013-5ccc-42e0-8361-324fa61440c0-kube-api-access-zwgpm\") pod \"8d0a4013-5ccc-42e0-8361-324fa61440c0\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.146044 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-catalog-content\") pod \"8d0a4013-5ccc-42e0-8361-324fa61440c0\" (UID: \"8d0a4013-5ccc-42e0-8361-324fa61440c0\") " Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.147735 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-utilities" (OuterVolumeSpecName: "utilities") pod "8d0a4013-5ccc-42e0-8361-324fa61440c0" (UID: "8d0a4013-5ccc-42e0-8361-324fa61440c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.152919 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0a4013-5ccc-42e0-8361-324fa61440c0-kube-api-access-zwgpm" (OuterVolumeSpecName: "kube-api-access-zwgpm") pod "8d0a4013-5ccc-42e0-8361-324fa61440c0" (UID: "8d0a4013-5ccc-42e0-8361-324fa61440c0"). InnerVolumeSpecName "kube-api-access-zwgpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.190611 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d0a4013-5ccc-42e0-8361-324fa61440c0" (UID: "8d0a4013-5ccc-42e0-8361-324fa61440c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.248364 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.248409 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwgpm\" (UniqueName: \"kubernetes.io/projected/8d0a4013-5ccc-42e0-8361-324fa61440c0-kube-api-access-zwgpm\") on node \"crc\" DevicePath \"\"" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.248422 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d0a4013-5ccc-42e0-8361-324fa61440c0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.558398 4831 generic.go:334] "Generic (PLEG): container finished" podID="8d0a4013-5ccc-42e0-8361-324fa61440c0" containerID="7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7" exitCode=0 Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.558474 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhczj" event={"ID":"8d0a4013-5ccc-42e0-8361-324fa61440c0","Type":"ContainerDied","Data":"7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7"} Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.558503 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-fhczj" event={"ID":"8d0a4013-5ccc-42e0-8361-324fa61440c0","Type":"ContainerDied","Data":"c90a9448b6c1b08a270f052ec0e0cfa46ff3854eb58144c8f7bb90250dbdb63a"} Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.558524 4831 scope.go:117] "RemoveContainer" containerID="7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.558683 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhczj" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.586887 4831 scope.go:117] "RemoveContainer" containerID="7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.596813 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhczj"] Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.607718 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fhczj"] Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.619453 4831 scope.go:117] "RemoveContainer" containerID="2bc97772803d611d3a877da6d640cfccbfe5b6527650ac531ffb57d7bf52cda9" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.660713 4831 scope.go:117] "RemoveContainer" containerID="7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7" Dec 04 10:52:01 crc kubenswrapper[4831]: E1204 10:52:01.661174 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7\": container with ID starting with 7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7 not found: ID does not exist" containerID="7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 
10:52:01.661234 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7"} err="failed to get container status \"7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7\": rpc error: code = NotFound desc = could not find container \"7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7\": container with ID starting with 7c0d86ee7a12d36e34c0270f296686d21765a7eae8bdca969f4d0186665375a7 not found: ID does not exist" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.661268 4831 scope.go:117] "RemoveContainer" containerID="7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff" Dec 04 10:52:01 crc kubenswrapper[4831]: E1204 10:52:01.661684 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff\": container with ID starting with 7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff not found: ID does not exist" containerID="7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.661759 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff"} err="failed to get container status \"7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff\": rpc error: code = NotFound desc = could not find container \"7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff\": container with ID starting with 7c1f2fab6a581ab5d6cbf33ce8d8968fa87b21bed91eac04b5b6c207b85417ff not found: ID does not exist" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.661817 4831 scope.go:117] "RemoveContainer" containerID="2bc97772803d611d3a877da6d640cfccbfe5b6527650ac531ffb57d7bf52cda9" Dec 04 10:52:01 crc 
kubenswrapper[4831]: E1204 10:52:01.662112 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bc97772803d611d3a877da6d640cfccbfe5b6527650ac531ffb57d7bf52cda9\": container with ID starting with 2bc97772803d611d3a877da6d640cfccbfe5b6527650ac531ffb57d7bf52cda9 not found: ID does not exist" containerID="2bc97772803d611d3a877da6d640cfccbfe5b6527650ac531ffb57d7bf52cda9" Dec 04 10:52:01 crc kubenswrapper[4831]: I1204 10:52:01.662136 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc97772803d611d3a877da6d640cfccbfe5b6527650ac531ffb57d7bf52cda9"} err="failed to get container status \"2bc97772803d611d3a877da6d640cfccbfe5b6527650ac531ffb57d7bf52cda9\": rpc error: code = NotFound desc = could not find container \"2bc97772803d611d3a877da6d640cfccbfe5b6527650ac531ffb57d7bf52cda9\": container with ID starting with 2bc97772803d611d3a877da6d640cfccbfe5b6527650ac531ffb57d7bf52cda9 not found: ID does not exist" Dec 04 10:52:03 crc kubenswrapper[4831]: I1204 10:52:03.305142 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d0a4013-5ccc-42e0-8361-324fa61440c0" path="/var/lib/kubelet/pods/8d0a4013-5ccc-42e0-8361-324fa61440c0/volumes" Dec 04 10:54:21 crc kubenswrapper[4831]: I1204 10:54:21.971432 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:54:21 crc kubenswrapper[4831]: I1204 10:54:21.972046 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 04 10:54:51 crc kubenswrapper[4831]: I1204 10:54:51.971938 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:54:51 crc kubenswrapper[4831]: I1204 10:54:51.972547 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:55:21 crc kubenswrapper[4831]: I1204 10:55:21.971698 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:55:21 crc kubenswrapper[4831]: I1204 10:55:21.972209 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:55:21 crc kubenswrapper[4831]: I1204 10:55:21.972265 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 10:55:21 crc kubenswrapper[4831]: I1204 10:55:21.973110 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:55:21 crc kubenswrapper[4831]: I1204 10:55:21.973181 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" gracePeriod=600 Dec 04 10:55:22 crc kubenswrapper[4831]: E1204 10:55:22.092604 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:55:22 crc kubenswrapper[4831]: I1204 10:55:22.636649 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" exitCode=0 Dec 04 10:55:22 crc kubenswrapper[4831]: I1204 10:55:22.636686 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9"} Dec 04 10:55:22 crc kubenswrapper[4831]: I1204 10:55:22.637021 4831 scope.go:117] "RemoveContainer" containerID="80db97831c4a7e8b6da9c9072594827bb567104b86c587975c95004562b3e059" Dec 04 10:55:22 crc kubenswrapper[4831]: I1204 10:55:22.637726 4831 
scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:55:22 crc kubenswrapper[4831]: E1204 10:55:22.637999 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:55:35 crc kubenswrapper[4831]: I1204 10:55:35.276915 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:55:35 crc kubenswrapper[4831]: E1204 10:55:35.277993 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:55:46 crc kubenswrapper[4831]: I1204 10:55:46.276083 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:55:46 crc kubenswrapper[4831]: E1204 10:55:46.276942 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:55:58 crc kubenswrapper[4831]: I1204 
10:55:58.277075 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:55:58 crc kubenswrapper[4831]: E1204 10:55:58.278170 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:56:11 crc kubenswrapper[4831]: I1204 10:56:11.276720 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:56:11 crc kubenswrapper[4831]: E1204 10:56:11.277481 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:56:24 crc kubenswrapper[4831]: I1204 10:56:24.193725 4831 generic.go:334] "Generic (PLEG): container finished" podID="edce9302-713f-454f-b725-e30e8f594cd3" containerID="e29fb17e04897092881580916ddef070c474732da6f1111303d1bde3ee20210b" exitCode=0 Dec 04 10:56:24 crc kubenswrapper[4831]: I1204 10:56:24.193795 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" event={"ID":"edce9302-713f-454f-b725-e30e8f594cd3","Type":"ContainerDied","Data":"e29fb17e04897092881580916ddef070c474732da6f1111303d1bde3ee20210b"} Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.277952 4831 scope.go:117] "RemoveContainer" 
containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:56:25 crc kubenswrapper[4831]: E1204 10:56:25.278362 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.647845 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.736820 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-combined-ca-bundle\") pod \"edce9302-713f-454f-b725-e30e8f594cd3\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.736870 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f422\" (UniqueName: \"kubernetes.io/projected/edce9302-713f-454f-b725-e30e8f594cd3-kube-api-access-8f422\") pod \"edce9302-713f-454f-b725-e30e8f594cd3\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.737008 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-inventory\") pod \"edce9302-713f-454f-b725-e30e8f594cd3\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.737052 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-secret-0\") pod \"edce9302-713f-454f-b725-e30e8f594cd3\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.737184 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-ssh-key\") pod \"edce9302-713f-454f-b725-e30e8f594cd3\" (UID: \"edce9302-713f-454f-b725-e30e8f594cd3\") " Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.742214 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "edce9302-713f-454f-b725-e30e8f594cd3" (UID: "edce9302-713f-454f-b725-e30e8f594cd3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.743343 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edce9302-713f-454f-b725-e30e8f594cd3-kube-api-access-8f422" (OuterVolumeSpecName: "kube-api-access-8f422") pod "edce9302-713f-454f-b725-e30e8f594cd3" (UID: "edce9302-713f-454f-b725-e30e8f594cd3"). InnerVolumeSpecName "kube-api-access-8f422". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.765255 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "edce9302-713f-454f-b725-e30e8f594cd3" (UID: "edce9302-713f-454f-b725-e30e8f594cd3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.765350 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "edce9302-713f-454f-b725-e30e8f594cd3" (UID: "edce9302-713f-454f-b725-e30e8f594cd3"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.782257 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-inventory" (OuterVolumeSpecName: "inventory") pod "edce9302-713f-454f-b725-e30e8f594cd3" (UID: "edce9302-713f-454f-b725-e30e8f594cd3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.839322 4831 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.839359 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f422\" (UniqueName: \"kubernetes.io/projected/edce9302-713f-454f-b725-e30e8f594cd3-kube-api-access-8f422\") on node \"crc\" DevicePath \"\"" Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.839370 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.839379 4831 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-libvirt-secret-0\") on node \"crc\" 
DevicePath \"\"" Dec 04 10:56:25 crc kubenswrapper[4831]: I1204 10:56:25.839387 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edce9302-713f-454f-b725-e30e8f594cd3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.228189 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" event={"ID":"edce9302-713f-454f-b725-e30e8f594cd3","Type":"ContainerDied","Data":"39eaa24ef73996a64a1ca9cb54b44be1153a0dd863107646f262e8513cfab806"} Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.228603 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39eaa24ef73996a64a1ca9cb54b44be1153a0dd863107646f262e8513cfab806" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.228545 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rh529" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.315366 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz"] Dec 04 10:56:26 crc kubenswrapper[4831]: E1204 10:56:26.315783 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" containerName="registry-server" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.315798 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" containerName="registry-server" Dec 04 10:56:26 crc kubenswrapper[4831]: E1204 10:56:26.315817 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0a4013-5ccc-42e0-8361-324fa61440c0" containerName="registry-server" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.315822 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0a4013-5ccc-42e0-8361-324fa61440c0" 
containerName="registry-server" Dec 04 10:56:26 crc kubenswrapper[4831]: E1204 10:56:26.315841 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0a4013-5ccc-42e0-8361-324fa61440c0" containerName="extract-utilities" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.315848 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0a4013-5ccc-42e0-8361-324fa61440c0" containerName="extract-utilities" Dec 04 10:56:26 crc kubenswrapper[4831]: E1204 10:56:26.315869 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerName="registry-server" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.315878 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerName="registry-server" Dec 04 10:56:26 crc kubenswrapper[4831]: E1204 10:56:26.315892 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" containerName="extract-content" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.315899 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" containerName="extract-content" Dec 04 10:56:26 crc kubenswrapper[4831]: E1204 10:56:26.315910 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerName="extract-utilities" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.315918 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerName="extract-utilities" Dec 04 10:56:26 crc kubenswrapper[4831]: E1204 10:56:26.315948 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" containerName="extract-utilities" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.315954 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" 
containerName="extract-utilities" Dec 04 10:56:26 crc kubenswrapper[4831]: E1204 10:56:26.315968 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0a4013-5ccc-42e0-8361-324fa61440c0" containerName="extract-content" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.315974 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0a4013-5ccc-42e0-8361-324fa61440c0" containerName="extract-content" Dec 04 10:56:26 crc kubenswrapper[4831]: E1204 10:56:26.315987 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edce9302-713f-454f-b725-e30e8f594cd3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.315994 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="edce9302-713f-454f-b725-e30e8f594cd3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 10:56:26 crc kubenswrapper[4831]: E1204 10:56:26.316010 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerName="extract-content" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.316016 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerName="extract-content" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.316197 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="78bc4fa0-3286-4b3e-a796-34d6c1784cdc" containerName="registry-server" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.316210 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="128686bd-ffd4-4b11-b829-98a65e2ddf01" containerName="registry-server" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.316219 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="edce9302-713f-454f-b725-e30e8f594cd3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.316233 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8d0a4013-5ccc-42e0-8361-324fa61440c0" containerName="registry-server" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.316934 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.322835 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.323472 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.324188 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.324725 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.325001 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz"] Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.325119 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.325303 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.325439 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.349463 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.349645 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.349822 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.349898 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.349930 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9pfc\" (UniqueName: \"kubernetes.io/projected/f93bea5d-e88b-491b-aafb-a86d7bdfa024-kube-api-access-n9pfc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.350132 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.350276 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.350398 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.351391 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.453717 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.454514 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.454587 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.454627 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.454682 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: 
\"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.454748 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.454779 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.454805 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.454825 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9pfc\" (UniqueName: \"kubernetes.io/projected/f93bea5d-e88b-491b-aafb-a86d7bdfa024-kube-api-access-n9pfc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.455854 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" 
(UniqueName: \"kubernetes.io/configmap/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.458250 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.458782 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.458888 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.459449 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.460723 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.461348 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.461352 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.474109 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9pfc\" (UniqueName: \"kubernetes.io/projected/f93bea5d-e88b-491b-aafb-a86d7bdfa024-kube-api-access-n9pfc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jmdvz\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:26 crc kubenswrapper[4831]: I1204 10:56:26.633165 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:56:27 crc kubenswrapper[4831]: I1204 10:56:27.203612 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz"] Dec 04 10:56:27 crc kubenswrapper[4831]: I1204 10:56:27.242838 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" event={"ID":"f93bea5d-e88b-491b-aafb-a86d7bdfa024","Type":"ContainerStarted","Data":"b289e149ce35dcbd7c7f740d998436d81c486d1afc82937755fdf3166be97b94"} Dec 04 10:56:28 crc kubenswrapper[4831]: I1204 10:56:28.252980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" event={"ID":"f93bea5d-e88b-491b-aafb-a86d7bdfa024","Type":"ContainerStarted","Data":"71f0f3db0e4644573cf6b59537d8343493d8fa03f6cef365a735b198c0dcda12"} Dec 04 10:56:28 crc kubenswrapper[4831]: I1204 10:56:28.281624 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" podStartSLOduration=1.7822836720000002 podStartE2EDuration="2.281607112s" podCreationTimestamp="2025-12-04 10:56:26 +0000 UTC" firstStartedPulling="2025-12-04 10:56:27.20979955 +0000 UTC m=+2484.158974864" lastFinishedPulling="2025-12-04 10:56:27.70912299 +0000 UTC m=+2484.658298304" observedRunningTime="2025-12-04 10:56:28.268646126 +0000 UTC m=+2485.217821440" watchObservedRunningTime="2025-12-04 10:56:28.281607112 +0000 UTC m=+2485.230782426" Dec 04 10:56:36 crc kubenswrapper[4831]: I1204 10:56:36.276692 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:56:36 crc kubenswrapper[4831]: E1204 10:56:36.277685 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:56:51 crc kubenswrapper[4831]: I1204 10:56:51.276407 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:56:51 crc kubenswrapper[4831]: E1204 10:56:51.277360 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.292433 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rpbp2"] Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.295606 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.303496 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpbp2"] Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.398254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v24x7\" (UniqueName: \"kubernetes.io/projected/110e54b6-f3e2-4557-a68c-27281d928646-kube-api-access-v24x7\") pod \"redhat-marketplace-rpbp2\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.398754 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-utilities\") pod \"redhat-marketplace-rpbp2\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.398848 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-catalog-content\") pod \"redhat-marketplace-rpbp2\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.501412 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-utilities\") pod \"redhat-marketplace-rpbp2\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.501826 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-catalog-content\") pod \"redhat-marketplace-rpbp2\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.501900 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v24x7\" (UniqueName: \"kubernetes.io/projected/110e54b6-f3e2-4557-a68c-27281d928646-kube-api-access-v24x7\") pod \"redhat-marketplace-rpbp2\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.502178 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-utilities\") pod \"redhat-marketplace-rpbp2\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.502347 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-catalog-content\") pod \"redhat-marketplace-rpbp2\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.525934 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v24x7\" (UniqueName: \"kubernetes.io/projected/110e54b6-f3e2-4557-a68c-27281d928646-kube-api-access-v24x7\") pod \"redhat-marketplace-rpbp2\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:02 crc kubenswrapper[4831]: I1204 10:57:02.621275 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:03 crc kubenswrapper[4831]: I1204 10:57:03.091544 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpbp2"] Dec 04 10:57:03 crc kubenswrapper[4831]: I1204 10:57:03.615428 4831 generic.go:334] "Generic (PLEG): container finished" podID="110e54b6-f3e2-4557-a68c-27281d928646" containerID="017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547" exitCode=0 Dec 04 10:57:03 crc kubenswrapper[4831]: I1204 10:57:03.615683 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpbp2" event={"ID":"110e54b6-f3e2-4557-a68c-27281d928646","Type":"ContainerDied","Data":"017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547"} Dec 04 10:57:03 crc kubenswrapper[4831]: I1204 10:57:03.615911 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpbp2" event={"ID":"110e54b6-f3e2-4557-a68c-27281d928646","Type":"ContainerStarted","Data":"9bdabc973756ee512d25b6cdfd14cc1096adb8dd8459f39c82cc2a716cd1d075"} Dec 04 10:57:03 crc kubenswrapper[4831]: I1204 10:57:03.620227 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:57:04 crc kubenswrapper[4831]: I1204 10:57:04.626285 4831 generic.go:334] "Generic (PLEG): container finished" podID="110e54b6-f3e2-4557-a68c-27281d928646" containerID="1e96c37239dab52670bc19a600d3bfdb3328065918b9ca4e92ad7520443ed13b" exitCode=0 Dec 04 10:57:04 crc kubenswrapper[4831]: I1204 10:57:04.626320 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpbp2" event={"ID":"110e54b6-f3e2-4557-a68c-27281d928646","Type":"ContainerDied","Data":"1e96c37239dab52670bc19a600d3bfdb3328065918b9ca4e92ad7520443ed13b"} Dec 04 10:57:05 crc kubenswrapper[4831]: I1204 10:57:05.277739 4831 scope.go:117] "RemoveContainer" 
containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:57:05 crc kubenswrapper[4831]: E1204 10:57:05.278478 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:57:05 crc kubenswrapper[4831]: I1204 10:57:05.638875 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpbp2" event={"ID":"110e54b6-f3e2-4557-a68c-27281d928646","Type":"ContainerStarted","Data":"29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628"} Dec 04 10:57:05 crc kubenswrapper[4831]: I1204 10:57:05.674508 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rpbp2" podStartSLOduration=2.233319079 podStartE2EDuration="3.674486597s" podCreationTimestamp="2025-12-04 10:57:02 +0000 UTC" firstStartedPulling="2025-12-04 10:57:03.61998349 +0000 UTC m=+2520.569158804" lastFinishedPulling="2025-12-04 10:57:05.061151008 +0000 UTC m=+2522.010326322" observedRunningTime="2025-12-04 10:57:05.660224027 +0000 UTC m=+2522.609399351" watchObservedRunningTime="2025-12-04 10:57:05.674486597 +0000 UTC m=+2522.623661911" Dec 04 10:57:09 crc kubenswrapper[4831]: E1204 10:57:09.027564 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-conmon-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:57:12 crc kubenswrapper[4831]: I1204 10:57:12.622190 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:12 crc kubenswrapper[4831]: I1204 10:57:12.622884 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:12 crc kubenswrapper[4831]: I1204 10:57:12.679345 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:12 crc kubenswrapper[4831]: I1204 10:57:12.759988 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:12 crc kubenswrapper[4831]: I1204 10:57:12.923149 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpbp2"] Dec 04 10:57:14 crc kubenswrapper[4831]: I1204 10:57:14.735930 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rpbp2" podUID="110e54b6-f3e2-4557-a68c-27281d928646" containerName="registry-server" containerID="cri-o://29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628" gracePeriod=2 Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.237512 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.370086 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-catalog-content\") pod \"110e54b6-f3e2-4557-a68c-27281d928646\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.370266 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v24x7\" (UniqueName: \"kubernetes.io/projected/110e54b6-f3e2-4557-a68c-27281d928646-kube-api-access-v24x7\") pod \"110e54b6-f3e2-4557-a68c-27281d928646\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.370299 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-utilities\") pod \"110e54b6-f3e2-4557-a68c-27281d928646\" (UID: \"110e54b6-f3e2-4557-a68c-27281d928646\") " Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.371225 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-utilities" (OuterVolumeSpecName: "utilities") pod "110e54b6-f3e2-4557-a68c-27281d928646" (UID: "110e54b6-f3e2-4557-a68c-27281d928646"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.371767 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.380000 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/110e54b6-f3e2-4557-a68c-27281d928646-kube-api-access-v24x7" (OuterVolumeSpecName: "kube-api-access-v24x7") pod "110e54b6-f3e2-4557-a68c-27281d928646" (UID: "110e54b6-f3e2-4557-a68c-27281d928646"). InnerVolumeSpecName "kube-api-access-v24x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.393964 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "110e54b6-f3e2-4557-a68c-27281d928646" (UID: "110e54b6-f3e2-4557-a68c-27281d928646"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.473996 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v24x7\" (UniqueName: \"kubernetes.io/projected/110e54b6-f3e2-4557-a68c-27281d928646-kube-api-access-v24x7\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.474034 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110e54b6-f3e2-4557-a68c-27281d928646-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.750086 4831 generic.go:334] "Generic (PLEG): container finished" podID="110e54b6-f3e2-4557-a68c-27281d928646" containerID="29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628" exitCode=0 Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.750181 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpbp2" event={"ID":"110e54b6-f3e2-4557-a68c-27281d928646","Type":"ContainerDied","Data":"29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628"} Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.750249 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpbp2" event={"ID":"110e54b6-f3e2-4557-a68c-27281d928646","Type":"ContainerDied","Data":"9bdabc973756ee512d25b6cdfd14cc1096adb8dd8459f39c82cc2a716cd1d075"} Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.750277 4831 scope.go:117] "RemoveContainer" containerID="29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.750289 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rpbp2" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.781880 4831 scope.go:117] "RemoveContainer" containerID="1e96c37239dab52670bc19a600d3bfdb3328065918b9ca4e92ad7520443ed13b" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.802223 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpbp2"] Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.822143 4831 scope.go:117] "RemoveContainer" containerID="017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.823577 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpbp2"] Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.860052 4831 scope.go:117] "RemoveContainer" containerID="29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628" Dec 04 10:57:15 crc kubenswrapper[4831]: E1204 10:57:15.860466 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628\": container with ID starting with 29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628 not found: ID does not exist" containerID="29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.860500 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628"} err="failed to get container status \"29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628\": rpc error: code = NotFound desc = could not find container \"29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628\": container with ID starting with 29c5c5911eb0396e59ee10589f6cd3621b5944fc2b1ed64d3df849d1d985e628 not found: 
ID does not exist" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.860525 4831 scope.go:117] "RemoveContainer" containerID="1e96c37239dab52670bc19a600d3bfdb3328065918b9ca4e92ad7520443ed13b" Dec 04 10:57:15 crc kubenswrapper[4831]: E1204 10:57:15.860959 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e96c37239dab52670bc19a600d3bfdb3328065918b9ca4e92ad7520443ed13b\": container with ID starting with 1e96c37239dab52670bc19a600d3bfdb3328065918b9ca4e92ad7520443ed13b not found: ID does not exist" containerID="1e96c37239dab52670bc19a600d3bfdb3328065918b9ca4e92ad7520443ed13b" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.861001 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e96c37239dab52670bc19a600d3bfdb3328065918b9ca4e92ad7520443ed13b"} err="failed to get container status \"1e96c37239dab52670bc19a600d3bfdb3328065918b9ca4e92ad7520443ed13b\": rpc error: code = NotFound desc = could not find container \"1e96c37239dab52670bc19a600d3bfdb3328065918b9ca4e92ad7520443ed13b\": container with ID starting with 1e96c37239dab52670bc19a600d3bfdb3328065918b9ca4e92ad7520443ed13b not found: ID does not exist" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.861025 4831 scope.go:117] "RemoveContainer" containerID="017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547" Dec 04 10:57:15 crc kubenswrapper[4831]: E1204 10:57:15.861219 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547\": container with ID starting with 017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547 not found: ID does not exist" containerID="017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547" Dec 04 10:57:15 crc kubenswrapper[4831]: I1204 10:57:15.861244 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547"} err="failed to get container status \"017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547\": rpc error: code = NotFound desc = could not find container \"017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547\": container with ID starting with 017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547 not found: ID does not exist" Dec 04 10:57:17 crc kubenswrapper[4831]: I1204 10:57:17.288071 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="110e54b6-f3e2-4557-a68c-27281d928646" path="/var/lib/kubelet/pods/110e54b6-f3e2-4557-a68c-27281d928646/volumes" Dec 04 10:57:18 crc kubenswrapper[4831]: I1204 10:57:18.276756 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:57:18 crc kubenswrapper[4831]: E1204 10:57:18.277172 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:57:19 crc kubenswrapper[4831]: E1204 10:57:19.308165 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-conmon-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:57:29 crc kubenswrapper[4831]: E1204 10:57:29.591257 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-conmon-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:57:30 crc kubenswrapper[4831]: I1204 10:57:30.277313 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:57:30 crc kubenswrapper[4831]: E1204 10:57:30.278294 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:57:39 crc kubenswrapper[4831]: E1204 10:57:39.834777 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-conmon-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:57:43 crc kubenswrapper[4831]: I1204 10:57:43.285922 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:57:43 crc kubenswrapper[4831]: E1204 10:57:43.286722 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:57:50 crc kubenswrapper[4831]: E1204 10:57:50.085494 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-conmon-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:57:55 crc kubenswrapper[4831]: I1204 10:57:55.276230 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:57:55 crc kubenswrapper[4831]: E1204 10:57:55.276986 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:58:00 crc kubenswrapper[4831]: E1204 10:58:00.349866 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-conmon-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110e54b6_f3e2_4557_a68c_27281d928646.slice/crio-017d2ad613a1ec4d5ee186debdd1726d620e4c902e7da667f5a9d8a146e35547.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:58:09 crc kubenswrapper[4831]: I1204 10:58:09.277839 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:58:09 crc kubenswrapper[4831]: E1204 10:58:09.278898 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:58:23 crc kubenswrapper[4831]: I1204 10:58:23.286929 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:58:23 crc kubenswrapper[4831]: E1204 10:58:23.287963 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:58:36 crc kubenswrapper[4831]: I1204 10:58:36.277555 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:58:36 crc kubenswrapper[4831]: E1204 10:58:36.279223 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:58:48 crc kubenswrapper[4831]: I1204 10:58:48.278305 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:58:48 crc kubenswrapper[4831]: E1204 10:58:48.278993 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:59:00 crc kubenswrapper[4831]: I1204 10:59:00.276507 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:59:00 crc kubenswrapper[4831]: E1204 10:59:00.277280 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:59:10 crc kubenswrapper[4831]: I1204 10:59:10.926166 4831 generic.go:334] "Generic (PLEG): container finished" podID="f93bea5d-e88b-491b-aafb-a86d7bdfa024" containerID="71f0f3db0e4644573cf6b59537d8343493d8fa03f6cef365a735b198c0dcda12" exitCode=0 Dec 04 10:59:10 crc kubenswrapper[4831]: I1204 10:59:10.926252 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" event={"ID":"f93bea5d-e88b-491b-aafb-a86d7bdfa024","Type":"ContainerDied","Data":"71f0f3db0e4644573cf6b59537d8343493d8fa03f6cef365a735b198c0dcda12"} Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.357871 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.488190 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-0\") pod \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.488683 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-1\") pod \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.488713 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-ssh-key\") pod \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.488740 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9pfc\" (UniqueName: \"kubernetes.io/projected/f93bea5d-e88b-491b-aafb-a86d7bdfa024-kube-api-access-n9pfc\") pod \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.488783 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-0\") pod \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.488823 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-combined-ca-bundle\") pod \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.488961 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-extra-config-0\") pod \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.488987 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-inventory\") pod \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " Dec 04 10:59:12 crc 
kubenswrapper[4831]: I1204 10:59:12.489014 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-1\") pod \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\" (UID: \"f93bea5d-e88b-491b-aafb-a86d7bdfa024\") " Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.495065 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93bea5d-e88b-491b-aafb-a86d7bdfa024-kube-api-access-n9pfc" (OuterVolumeSpecName: "kube-api-access-n9pfc") pod "f93bea5d-e88b-491b-aafb-a86d7bdfa024" (UID: "f93bea5d-e88b-491b-aafb-a86d7bdfa024"). InnerVolumeSpecName "kube-api-access-n9pfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.495335 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f93bea5d-e88b-491b-aafb-a86d7bdfa024" (UID: "f93bea5d-e88b-491b-aafb-a86d7bdfa024"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.517818 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f93bea5d-e88b-491b-aafb-a86d7bdfa024" (UID: "f93bea5d-e88b-491b-aafb-a86d7bdfa024"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.526092 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f93bea5d-e88b-491b-aafb-a86d7bdfa024" (UID: "f93bea5d-e88b-491b-aafb-a86d7bdfa024"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.529107 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f93bea5d-e88b-491b-aafb-a86d7bdfa024" (UID: "f93bea5d-e88b-491b-aafb-a86d7bdfa024"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.529137 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f93bea5d-e88b-491b-aafb-a86d7bdfa024" (UID: "f93bea5d-e88b-491b-aafb-a86d7bdfa024"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.536961 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f93bea5d-e88b-491b-aafb-a86d7bdfa024" (UID: "f93bea5d-e88b-491b-aafb-a86d7bdfa024"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.540864 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-inventory" (OuterVolumeSpecName: "inventory") pod "f93bea5d-e88b-491b-aafb-a86d7bdfa024" (UID: "f93bea5d-e88b-491b-aafb-a86d7bdfa024"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.545574 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "f93bea5d-e88b-491b-aafb-a86d7bdfa024" (UID: "f93bea5d-e88b-491b-aafb-a86d7bdfa024"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.591466 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.591498 4831 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.591510 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.591522 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.591533 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.591545 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9pfc\" (UniqueName: \"kubernetes.io/projected/f93bea5d-e88b-491b-aafb-a86d7bdfa024-kube-api-access-n9pfc\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.591555 4831 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.591563 4831 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.591572 4831 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f93bea5d-e88b-491b-aafb-a86d7bdfa024-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.950114 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" event={"ID":"f93bea5d-e88b-491b-aafb-a86d7bdfa024","Type":"ContainerDied","Data":"b289e149ce35dcbd7c7f740d998436d81c486d1afc82937755fdf3166be97b94"} Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.950150 4831 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b289e149ce35dcbd7c7f740d998436d81c486d1afc82937755fdf3166be97b94" Dec 04 10:59:12 crc kubenswrapper[4831]: I1204 10:59:12.950169 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jmdvz" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.066714 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9"] Dec 04 10:59:13 crc kubenswrapper[4831]: E1204 10:59:13.067285 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110e54b6-f3e2-4557-a68c-27281d928646" containerName="extract-utilities" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.067303 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="110e54b6-f3e2-4557-a68c-27281d928646" containerName="extract-utilities" Dec 04 10:59:13 crc kubenswrapper[4831]: E1204 10:59:13.067329 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93bea5d-e88b-491b-aafb-a86d7bdfa024" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.067337 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93bea5d-e88b-491b-aafb-a86d7bdfa024" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 10:59:13 crc kubenswrapper[4831]: E1204 10:59:13.067353 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110e54b6-f3e2-4557-a68c-27281d928646" containerName="extract-content" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.067362 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="110e54b6-f3e2-4557-a68c-27281d928646" containerName="extract-content" Dec 04 10:59:13 crc kubenswrapper[4831]: E1204 10:59:13.067380 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110e54b6-f3e2-4557-a68c-27281d928646" containerName="registry-server" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.067386 4831 
state_mem.go:107] "Deleted CPUSet assignment" podUID="110e54b6-f3e2-4557-a68c-27281d928646" containerName="registry-server" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.067697 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93bea5d-e88b-491b-aafb-a86d7bdfa024" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.067719 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="110e54b6-f3e2-4557-a68c-27281d928646" containerName="registry-server" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.068522 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.070977 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.071969 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.071977 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-l2wn5" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.072096 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.072182 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.078497 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9"] Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.101760 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.101842 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.101871 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.101940 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.102018 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ssh-key\") 
pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.102072 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slvgv\" (UniqueName: \"kubernetes.io/projected/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-kube-api-access-slvgv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.102121 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.204313 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.204411 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slvgv\" (UniqueName: \"kubernetes.io/projected/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-kube-api-access-slvgv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" 
Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.204468 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.204536 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.204592 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.204619 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.204734 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.208132 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.208417 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.209169 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.209338 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: 
\"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.215430 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.216529 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.224175 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slvgv\" (UniqueName: \"kubernetes.io/projected/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-kube-api-access-slvgv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.282381 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:59:13 crc kubenswrapper[4831]: E1204 10:59:13.282737 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.391291 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.928810 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9"] Dec 04 10:59:13 crc kubenswrapper[4831]: I1204 10:59:13.962917 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" event={"ID":"c7c8a31d-edf2-4e59-b66f-2e5ddab99661","Type":"ContainerStarted","Data":"d57ee79c73e567c8381e3424a74ddf01ee612ded31e27dc906e195447a6408e7"} Dec 04 10:59:14 crc kubenswrapper[4831]: I1204 10:59:14.981356 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" event={"ID":"c7c8a31d-edf2-4e59-b66f-2e5ddab99661","Type":"ContainerStarted","Data":"6eef15ad0766607a61f47e54987cf920ca7028657ecbe624a7467e01d38a5fa9"} Dec 04 10:59:15 crc kubenswrapper[4831]: I1204 10:59:15.009309 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" podStartSLOduration=1.560703537 podStartE2EDuration="2.009279317s" podCreationTimestamp="2025-12-04 10:59:13 +0000 UTC" firstStartedPulling="2025-12-04 10:59:13.937396823 +0000 UTC m=+2650.886572147" lastFinishedPulling="2025-12-04 10:59:14.385972613 +0000 UTC m=+2651.335147927" observedRunningTime="2025-12-04 10:59:15.006022443 +0000 UTC m=+2651.955197757" watchObservedRunningTime="2025-12-04 10:59:15.009279317 +0000 UTC m=+2651.958454631" Dec 04 10:59:26 crc kubenswrapper[4831]: I1204 10:59:26.276231 4831 scope.go:117] "RemoveContainer" 
containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:59:26 crc kubenswrapper[4831]: E1204 10:59:26.276924 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:59:37 crc kubenswrapper[4831]: I1204 10:59:37.276604 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:59:37 crc kubenswrapper[4831]: E1204 10:59:37.278543 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 10:59:52 crc kubenswrapper[4831]: I1204 10:59:52.277435 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 10:59:52 crc kubenswrapper[4831]: E1204 10:59:52.278301 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.144933 4831 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6"] Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.147251 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.151225 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.151551 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.155290 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6"] Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.296371 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56755d7b-e8d8-44a8-a26d-79482eda17ac-config-volume\") pod \"collect-profiles-29414100-xvgt6\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.296447 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56755d7b-e8d8-44a8-a26d-79482eda17ac-secret-volume\") pod \"collect-profiles-29414100-xvgt6\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.296522 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zd8x\" 
(UniqueName: \"kubernetes.io/projected/56755d7b-e8d8-44a8-a26d-79482eda17ac-kube-api-access-2zd8x\") pod \"collect-profiles-29414100-xvgt6\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.398659 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zd8x\" (UniqueName: \"kubernetes.io/projected/56755d7b-e8d8-44a8-a26d-79482eda17ac-kube-api-access-2zd8x\") pod \"collect-profiles-29414100-xvgt6\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.398898 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56755d7b-e8d8-44a8-a26d-79482eda17ac-config-volume\") pod \"collect-profiles-29414100-xvgt6\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.399012 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56755d7b-e8d8-44a8-a26d-79482eda17ac-secret-volume\") pod \"collect-profiles-29414100-xvgt6\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.400005 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56755d7b-e8d8-44a8-a26d-79482eda17ac-config-volume\") pod \"collect-profiles-29414100-xvgt6\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 
11:00:00.405911 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56755d7b-e8d8-44a8-a26d-79482eda17ac-secret-volume\") pod \"collect-profiles-29414100-xvgt6\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.418627 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zd8x\" (UniqueName: \"kubernetes.io/projected/56755d7b-e8d8-44a8-a26d-79482eda17ac-kube-api-access-2zd8x\") pod \"collect-profiles-29414100-xvgt6\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.479002 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:00 crc kubenswrapper[4831]: I1204 11:00:00.958617 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6"] Dec 04 11:00:01 crc kubenswrapper[4831]: I1204 11:00:01.431181 4831 generic.go:334] "Generic (PLEG): container finished" podID="56755d7b-e8d8-44a8-a26d-79482eda17ac" containerID="cb50151ba535d148a6289fdefc036efc96d45e8f164d6959103f20636645b426" exitCode=0 Dec 04 11:00:01 crc kubenswrapper[4831]: I1204 11:00:01.431229 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" event={"ID":"56755d7b-e8d8-44a8-a26d-79482eda17ac","Type":"ContainerDied","Data":"cb50151ba535d148a6289fdefc036efc96d45e8f164d6959103f20636645b426"} Dec 04 11:00:01 crc kubenswrapper[4831]: I1204 11:00:01.431257 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" 
event={"ID":"56755d7b-e8d8-44a8-a26d-79482eda17ac","Type":"ContainerStarted","Data":"9a9cccb82c21cad3ac181534851499328bc1e1e118cae1e18bc4f7b06081ee06"} Dec 04 11:00:02 crc kubenswrapper[4831]: I1204 11:00:02.785686 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:02 crc kubenswrapper[4831]: I1204 11:00:02.946975 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zd8x\" (UniqueName: \"kubernetes.io/projected/56755d7b-e8d8-44a8-a26d-79482eda17ac-kube-api-access-2zd8x\") pod \"56755d7b-e8d8-44a8-a26d-79482eda17ac\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " Dec 04 11:00:02 crc kubenswrapper[4831]: I1204 11:00:02.947102 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56755d7b-e8d8-44a8-a26d-79482eda17ac-config-volume\") pod \"56755d7b-e8d8-44a8-a26d-79482eda17ac\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " Dec 04 11:00:02 crc kubenswrapper[4831]: I1204 11:00:02.947154 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56755d7b-e8d8-44a8-a26d-79482eda17ac-secret-volume\") pod \"56755d7b-e8d8-44a8-a26d-79482eda17ac\" (UID: \"56755d7b-e8d8-44a8-a26d-79482eda17ac\") " Dec 04 11:00:02 crc kubenswrapper[4831]: I1204 11:00:02.947932 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56755d7b-e8d8-44a8-a26d-79482eda17ac-config-volume" (OuterVolumeSpecName: "config-volume") pod "56755d7b-e8d8-44a8-a26d-79482eda17ac" (UID: "56755d7b-e8d8-44a8-a26d-79482eda17ac"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:00:02 crc kubenswrapper[4831]: I1204 11:00:02.972844 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56755d7b-e8d8-44a8-a26d-79482eda17ac-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "56755d7b-e8d8-44a8-a26d-79482eda17ac" (UID: "56755d7b-e8d8-44a8-a26d-79482eda17ac"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:00:02 crc kubenswrapper[4831]: I1204 11:00:02.984619 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56755d7b-e8d8-44a8-a26d-79482eda17ac-kube-api-access-2zd8x" (OuterVolumeSpecName: "kube-api-access-2zd8x") pod "56755d7b-e8d8-44a8-a26d-79482eda17ac" (UID: "56755d7b-e8d8-44a8-a26d-79482eda17ac"). InnerVolumeSpecName "kube-api-access-2zd8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:00:03 crc kubenswrapper[4831]: I1204 11:00:03.049326 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zd8x\" (UniqueName: \"kubernetes.io/projected/56755d7b-e8d8-44a8-a26d-79482eda17ac-kube-api-access-2zd8x\") on node \"crc\" DevicePath \"\"" Dec 04 11:00:03 crc kubenswrapper[4831]: I1204 11:00:03.049377 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56755d7b-e8d8-44a8-a26d-79482eda17ac-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:00:03 crc kubenswrapper[4831]: I1204 11:00:03.049390 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56755d7b-e8d8-44a8-a26d-79482eda17ac-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:00:03 crc kubenswrapper[4831]: I1204 11:00:03.450097 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" 
event={"ID":"56755d7b-e8d8-44a8-a26d-79482eda17ac","Type":"ContainerDied","Data":"9a9cccb82c21cad3ac181534851499328bc1e1e118cae1e18bc4f7b06081ee06"} Dec 04 11:00:03 crc kubenswrapper[4831]: I1204 11:00:03.450140 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a9cccb82c21cad3ac181534851499328bc1e1e118cae1e18bc4f7b06081ee06" Dec 04 11:00:03 crc kubenswrapper[4831]: I1204 11:00:03.450198 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6" Dec 04 11:00:03 crc kubenswrapper[4831]: E1204 11:00:03.649481 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice/crio-9a9cccb82c21cad3ac181534851499328bc1e1e118cae1e18bc4f7b06081ee06\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice\": RecentStats: unable to find data in memory cache]" Dec 04 11:00:03 crc kubenswrapper[4831]: I1204 11:00:03.865660 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj"] Dec 04 11:00:03 crc kubenswrapper[4831]: I1204 11:00:03.879947 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-8r2vj"] Dec 04 11:00:05 crc kubenswrapper[4831]: I1204 11:00:05.290546 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93463415-c819-44b0-9ee7-0de0698eb6a6" path="/var/lib/kubelet/pods/93463415-c819-44b0-9ee7-0de0698eb6a6/volumes" Dec 04 11:00:06 crc kubenswrapper[4831]: I1204 11:00:06.278108 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 11:00:06 crc 
kubenswrapper[4831]: E1204 11:00:06.278835 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:00:13 crc kubenswrapper[4831]: E1204 11:00:13.885274 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice/crio-9a9cccb82c21cad3ac181534851499328bc1e1e118cae1e18bc4f7b06081ee06\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice\": RecentStats: unable to find data in memory cache]" Dec 04 11:00:20 crc kubenswrapper[4831]: I1204 11:00:20.276410 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 11:00:20 crc kubenswrapper[4831]: E1204 11:00:20.277263 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:00:24 crc kubenswrapper[4831]: E1204 11:00:24.154250 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice/crio-9a9cccb82c21cad3ac181534851499328bc1e1e118cae1e18bc4f7b06081ee06\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice\": RecentStats: unable to find data in memory cache]" Dec 04 11:00:24 crc kubenswrapper[4831]: I1204 11:00:24.737937 4831 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-cmg79 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 11:00:24 crc kubenswrapper[4831]: I1204 11:00:24.738278 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" podUID="bf8f0aa6-641c-4258-bd46-541bf71d40b1" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 11:00:24 crc kubenswrapper[4831]: I1204 11:00:24.752422 4831 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-cmg79 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 11:00:24 crc kubenswrapper[4831]: I1204 11:00:24.752769 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmg79" podUID="bf8f0aa6-641c-4258-bd46-541bf71d40b1" containerName="openshift-config-operator" 
probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 11:00:26 crc kubenswrapper[4831]: I1204 11:00:26.432812 4831 scope.go:117] "RemoveContainer" containerID="c90d8cfb7440730d1485d43bae49e534d471fe62033843e093914a44c8c4d6e0" Dec 04 11:00:32 crc kubenswrapper[4831]: I1204 11:00:32.277317 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 11:00:32 crc kubenswrapper[4831]: I1204 11:00:32.718687 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"f216158f3258fd1dae0c50bdc0d7bdad944b732feabe96a8d81c113e0657e369"} Dec 04 11:00:34 crc kubenswrapper[4831]: E1204 11:00:34.416365 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice/crio-9a9cccb82c21cad3ac181534851499328bc1e1e118cae1e18bc4f7b06081ee06\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice\": RecentStats: unable to find data in memory cache]" Dec 04 11:00:44 crc kubenswrapper[4831]: E1204 11:00:44.714204 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice/crio-9a9cccb82c21cad3ac181534851499328bc1e1e118cae1e18bc4f7b06081ee06\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice\": RecentStats: unable to find data in 
memory cache]" Dec 04 11:00:54 crc kubenswrapper[4831]: E1204 11:00:54.972801 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice/crio-9a9cccb82c21cad3ac181534851499328bc1e1e118cae1e18bc4f7b06081ee06\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56755d7b_e8d8_44a8_a26d_79482eda17ac.slice\": RecentStats: unable to find data in memory cache]" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.165128 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29414101-mjsx6"] Dec 04 11:01:00 crc kubenswrapper[4831]: E1204 11:01:00.166737 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56755d7b-e8d8-44a8-a26d-79482eda17ac" containerName="collect-profiles" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.166768 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="56755d7b-e8d8-44a8-a26d-79482eda17ac" containerName="collect-profiles" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.167183 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="56755d7b-e8d8-44a8-a26d-79482eda17ac" containerName="collect-profiles" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.168396 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.197240 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414101-mjsx6"] Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.346580 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bzgk\" (UniqueName: \"kubernetes.io/projected/5cd83458-aa47-4478-84dc-3dfeaf0829e1-kube-api-access-4bzgk\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.346950 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-combined-ca-bundle\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.347066 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-config-data\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.347126 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-fernet-keys\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.448449 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-config-data\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.448900 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-fernet-keys\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.449083 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzgk\" (UniqueName: \"kubernetes.io/projected/5cd83458-aa47-4478-84dc-3dfeaf0829e1-kube-api-access-4bzgk\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.449225 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-combined-ca-bundle\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.456523 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-config-data\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.457629 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-combined-ca-bundle\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.460109 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-fernet-keys\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.466517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bzgk\" (UniqueName: \"kubernetes.io/projected/5cd83458-aa47-4478-84dc-3dfeaf0829e1-kube-api-access-4bzgk\") pod \"keystone-cron-29414101-mjsx6\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.492097 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.973798 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414101-mjsx6"] Dec 04 11:01:00 crc kubenswrapper[4831]: I1204 11:01:00.988191 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-mjsx6" event={"ID":"5cd83458-aa47-4478-84dc-3dfeaf0829e1","Type":"ContainerStarted","Data":"1e155bfb088af5bf39340150fb50500a9e5107a27e55695dcc9cd4574ed758e5"} Dec 04 11:01:01 crc kubenswrapper[4831]: I1204 11:01:01.998495 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-mjsx6" event={"ID":"5cd83458-aa47-4478-84dc-3dfeaf0829e1","Type":"ContainerStarted","Data":"9b30598be83482e9ac38d4cd81d552605423631ae88b1a54106147e89baa854d"} Dec 04 11:01:02 crc kubenswrapper[4831]: I1204 11:01:02.014483 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29414101-mjsx6" podStartSLOduration=2.014464509 podStartE2EDuration="2.014464509s" podCreationTimestamp="2025-12-04 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:01:02.012634091 +0000 UTC m=+2758.961809415" watchObservedRunningTime="2025-12-04 11:01:02.014464509 +0000 UTC m=+2758.963639843" Dec 04 11:01:04 crc kubenswrapper[4831]: I1204 11:01:04.016570 4831 generic.go:334] "Generic (PLEG): container finished" podID="5cd83458-aa47-4478-84dc-3dfeaf0829e1" containerID="9b30598be83482e9ac38d4cd81d552605423631ae88b1a54106147e89baa854d" exitCode=0 Dec 04 11:01:04 crc kubenswrapper[4831]: I1204 11:01:04.016641 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-mjsx6" 
event={"ID":"5cd83458-aa47-4478-84dc-3dfeaf0829e1","Type":"ContainerDied","Data":"9b30598be83482e9ac38d4cd81d552605423631ae88b1a54106147e89baa854d"} Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.425825 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.566199 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-config-data\") pod \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.566308 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-fernet-keys\") pod \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.566461 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-combined-ca-bundle\") pod \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.566536 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bzgk\" (UniqueName: \"kubernetes.io/projected/5cd83458-aa47-4478-84dc-3dfeaf0829e1-kube-api-access-4bzgk\") pod \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\" (UID: \"5cd83458-aa47-4478-84dc-3dfeaf0829e1\") " Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.573379 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "5cd83458-aa47-4478-84dc-3dfeaf0829e1" (UID: "5cd83458-aa47-4478-84dc-3dfeaf0829e1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.573902 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cd83458-aa47-4478-84dc-3dfeaf0829e1-kube-api-access-4bzgk" (OuterVolumeSpecName: "kube-api-access-4bzgk") pod "5cd83458-aa47-4478-84dc-3dfeaf0829e1" (UID: "5cd83458-aa47-4478-84dc-3dfeaf0829e1"). InnerVolumeSpecName "kube-api-access-4bzgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.601903 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cd83458-aa47-4478-84dc-3dfeaf0829e1" (UID: "5cd83458-aa47-4478-84dc-3dfeaf0829e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.626030 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-config-data" (OuterVolumeSpecName: "config-data") pod "5cd83458-aa47-4478-84dc-3dfeaf0829e1" (UID: "5cd83458-aa47-4478-84dc-3dfeaf0829e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.669397 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.669775 4831 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.669786 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd83458-aa47-4478-84dc-3dfeaf0829e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:05 crc kubenswrapper[4831]: I1204 11:01:05.669797 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bzgk\" (UniqueName: \"kubernetes.io/projected/5cd83458-aa47-4478-84dc-3dfeaf0829e1-kube-api-access-4bzgk\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:06 crc kubenswrapper[4831]: I1204 11:01:06.039388 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-mjsx6" event={"ID":"5cd83458-aa47-4478-84dc-3dfeaf0829e1","Type":"ContainerDied","Data":"1e155bfb088af5bf39340150fb50500a9e5107a27e55695dcc9cd4574ed758e5"} Dec 04 11:01:06 crc kubenswrapper[4831]: I1204 11:01:06.039428 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e155bfb088af5bf39340150fb50500a9e5107a27e55695dcc9cd4574ed758e5" Dec 04 11:01:06 crc kubenswrapper[4831]: I1204 11:01:06.039830 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414101-mjsx6" Dec 04 11:01:23 crc kubenswrapper[4831]: I1204 11:01:23.204159 4831 generic.go:334] "Generic (PLEG): container finished" podID="c7c8a31d-edf2-4e59-b66f-2e5ddab99661" containerID="6eef15ad0766607a61f47e54987cf920ca7028657ecbe624a7467e01d38a5fa9" exitCode=0 Dec 04 11:01:23 crc kubenswrapper[4831]: I1204 11:01:23.204291 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" event={"ID":"c7c8a31d-edf2-4e59-b66f-2e5ddab99661","Type":"ContainerDied","Data":"6eef15ad0766607a61f47e54987cf920ca7028657ecbe624a7467e01d38a5fa9"} Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.701738 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.858333 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-1\") pod \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.858424 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-telemetry-combined-ca-bundle\") pod \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.858466 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-0\") pod \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\" (UID: 
\"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.858622 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-inventory\") pod \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.858735 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slvgv\" (UniqueName: \"kubernetes.io/projected/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-kube-api-access-slvgv\") pod \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.858805 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-2\") pod \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.858854 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ssh-key\") pod \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\" (UID: \"c7c8a31d-edf2-4e59-b66f-2e5ddab99661\") " Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.863495 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c7c8a31d-edf2-4e59-b66f-2e5ddab99661" (UID: "c7c8a31d-edf2-4e59-b66f-2e5ddab99661"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.878102 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-kube-api-access-slvgv" (OuterVolumeSpecName: "kube-api-access-slvgv") pod "c7c8a31d-edf2-4e59-b66f-2e5ddab99661" (UID: "c7c8a31d-edf2-4e59-b66f-2e5ddab99661"). InnerVolumeSpecName "kube-api-access-slvgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.892492 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "c7c8a31d-edf2-4e59-b66f-2e5ddab99661" (UID: "c7c8a31d-edf2-4e59-b66f-2e5ddab99661"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.894157 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c7c8a31d-edf2-4e59-b66f-2e5ddab99661" (UID: "c7c8a31d-edf2-4e59-b66f-2e5ddab99661"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.896149 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-inventory" (OuterVolumeSpecName: "inventory") pod "c7c8a31d-edf2-4e59-b66f-2e5ddab99661" (UID: "c7c8a31d-edf2-4e59-b66f-2e5ddab99661"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.896574 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "c7c8a31d-edf2-4e59-b66f-2e5ddab99661" (UID: "c7c8a31d-edf2-4e59-b66f-2e5ddab99661"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.906907 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "c7c8a31d-edf2-4e59-b66f-2e5ddab99661" (UID: "c7c8a31d-edf2-4e59-b66f-2e5ddab99661"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.961596 4831 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.961727 4831 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.961797 4831 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.961863 4831 
reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.961925 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slvgv\" (UniqueName: \"kubernetes.io/projected/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-kube-api-access-slvgv\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.961987 4831 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:24 crc kubenswrapper[4831]: I1204 11:01:24.962040 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7c8a31d-edf2-4e59-b66f-2e5ddab99661-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:25 crc kubenswrapper[4831]: I1204 11:01:25.233449 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" event={"ID":"c7c8a31d-edf2-4e59-b66f-2e5ddab99661","Type":"ContainerDied","Data":"d57ee79c73e567c8381e3424a74ddf01ee612ded31e27dc906e195447a6408e7"} Dec 04 11:01:25 crc kubenswrapper[4831]: I1204 11:01:25.233515 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d57ee79c73e567c8381e3424a74ddf01ee612ded31e27dc906e195447a6408e7" Dec 04 11:01:25 crc kubenswrapper[4831]: I1204 11:01:25.233574 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.841424 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 04 11:02:01 crc kubenswrapper[4831]: E1204 11:02:01.842387 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cd83458-aa47-4478-84dc-3dfeaf0829e1" containerName="keystone-cron" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.842404 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cd83458-aa47-4478-84dc-3dfeaf0829e1" containerName="keystone-cron" Dec 04 11:02:01 crc kubenswrapper[4831]: E1204 11:02:01.842418 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c8a31d-edf2-4e59-b66f-2e5ddab99661" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.842430 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c8a31d-edf2-4e59-b66f-2e5ddab99661" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.842653 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c8a31d-edf2-4e59-b66f-2e5ddab99661" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.842693 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cd83458-aa47-4478-84dc-3dfeaf0829e1" containerName="keystone-cron" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.844051 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.845896 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.875905 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.900559 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.902304 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.906114 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.917980 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.946584 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.948375 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.950154 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Dec 04 11:02:01 crc kubenswrapper[4831]: I1204 11:02:01.964857 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008053 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008110 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-sys\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008148 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008172 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008195 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-run\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008212 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008237 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008256 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008278 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008300 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008337 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008401 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f64zx\" (UniqueName: \"kubernetes.io/projected/c5605310-5110-4a33-84d3-56518bd49d56-kube-api-access-f64zx\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008428 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-sys\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008444 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008478 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008503 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008519 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-scripts\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008542 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94nnl\" (UniqueName: \"kubernetes.io/projected/6f75845a-f441-4d6a-a971-e24a8010d7fe-kube-api-access-94nnl\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008565 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008578 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008597 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008616 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008633 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008647 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008738 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-config-data-custom\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008755 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-dev\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008770 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008787 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008801 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-config-data\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008818 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-etc-machine-id\") pod 
\"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008832 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008847 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-lib-modules\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008862 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008885 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008901 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-etc-nvme\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 
11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008928 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008949 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008967 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008981 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.008996 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-run\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.009052 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.009142 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-dev\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.009184 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdcz4\" (UniqueName: \"kubernetes.io/projected/610ff7e3-10d8-460e-a7a1-2ad48221b858-kube-api-access-sdcz4\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.009241 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.009297 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121108 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121167 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-run\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121193 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121223 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121240 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121264 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " 
pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121284 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121329 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121358 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f64zx\" (UniqueName: \"kubernetes.io/projected/c5605310-5110-4a33-84d3-56518bd49d56-kube-api-access-f64zx\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121379 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-sys\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121397 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121420 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121448 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121466 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-scripts\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121503 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94nnl\" (UniqueName: \"kubernetes.io/projected/6f75845a-f441-4d6a-a971-e24a8010d7fe-kube-api-access-94nnl\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121546 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121569 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-var-lib-cinder\") pod \"cinder-backup-0\" (UID: 
\"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121609 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121630 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121696 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121722 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121752 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-config-data-custom\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121776 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-dev\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121797 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121821 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121845 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-config-data\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121870 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121890 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-lib-modules\") pod 
\"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121914 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-lib-modules\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121938 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.121985 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122010 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-etc-nvme\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122059 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122090 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122114 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122160 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122181 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-run\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122200 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122260 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-dev\") pod \"cinder-volume-nfs-0\" (UID: 
\"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122291 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdcz4\" (UniqueName: \"kubernetes.io/projected/610ff7e3-10d8-460e-a7a1-2ad48221b858-kube-api-access-sdcz4\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122323 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122351 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122378 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122404 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-sys\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.122445 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.123064 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.123411 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.123521 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.123574 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.123616 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-var-lib-cinder\") pod 
\"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.123651 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.123729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-run\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.123827 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.123921 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.123961 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.124337 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.124743 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.124901 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.124985 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-dev\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.125076 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-sys\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.125121 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " 
pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.128199 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.128272 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.128297 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-lib-modules\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.130003 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-run\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.134866 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.141896 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.142038 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.142128 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.142300 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-etc-nvme\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.142568 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.142636 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 
11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.142657 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-dev\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.142748 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c5605310-5110-4a33-84d3-56518bd49d56-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.142779 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/610ff7e3-10d8-460e-a7a1-2ad48221b858-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.142825 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f75845a-f441-4d6a-a971-e24a8010d7fe-sys\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.143473 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.145626 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-config-data\") pod 
\"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.153409 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.160432 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.167087 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.169150 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-config-data-custom\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.180389 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.183398 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.191211 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f75845a-f441-4d6a-a971-e24a8010d7fe-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.195458 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5605310-5110-4a33-84d3-56518bd49d56-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.196001 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/610ff7e3-10d8-460e-a7a1-2ad48221b858-scripts\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.203432 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdcz4\" (UniqueName: \"kubernetes.io/projected/610ff7e3-10d8-460e-a7a1-2ad48221b858-kube-api-access-sdcz4\") pod \"cinder-backup-0\" (UID: \"610ff7e3-10d8-460e-a7a1-2ad48221b858\") " pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.204023 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f64zx\" (UniqueName: \"kubernetes.io/projected/c5605310-5110-4a33-84d3-56518bd49d56-kube-api-access-f64zx\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"c5605310-5110-4a33-84d3-56518bd49d56\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.204018 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94nnl\" (UniqueName: \"kubernetes.io/projected/6f75845a-f441-4d6a-a971-e24a8010d7fe-kube-api-access-94nnl\") pod \"cinder-volume-nfs-0\" (UID: \"6f75845a-f441-4d6a-a971-e24a8010d7fe\") " pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.231363 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.276140 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.474424 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 11:02:02 crc kubenswrapper[4831]: I1204 11:02:02.968873 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 04 11:02:03 crc kubenswrapper[4831]: I1204 11:02:03.155505 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 11:02:03 crc kubenswrapper[4831]: I1204 11:02:03.507544 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 04 11:02:03 crc kubenswrapper[4831]: I1204 11:02:03.613404 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"6f75845a-f441-4d6a-a971-e24a8010d7fe","Type":"ContainerStarted","Data":"4b6e9d3cc4e41956dc09104d721a26cc8a49229729965a017cb0320394de3f40"} Dec 04 11:02:03 crc kubenswrapper[4831]: I1204 11:02:03.617046 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" 
event={"ID":"c5605310-5110-4a33-84d3-56518bd49d56","Type":"ContainerStarted","Data":"12284fb9da9da18f061ac888b6ccc2e0404c7b4fe62ac84b88bfe7a121e73aa2"} Dec 04 11:02:03 crc kubenswrapper[4831]: I1204 11:02:03.618130 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"610ff7e3-10d8-460e-a7a1-2ad48221b858","Type":"ContainerStarted","Data":"9e639f3fc4af429a9a19c7b7f7004e735974131ae0e5cdc3918f00f2c4930308"} Dec 04 11:02:05 crc kubenswrapper[4831]: I1204 11:02:05.648256 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"6f75845a-f441-4d6a-a971-e24a8010d7fe","Type":"ContainerStarted","Data":"7e8d58ecca3388e30066f376781eb10372f4c119466980421bb09afa6cc30d4e"} Dec 04 11:02:05 crc kubenswrapper[4831]: I1204 11:02:05.651842 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"c5605310-5110-4a33-84d3-56518bd49d56","Type":"ContainerStarted","Data":"14d35def6180b558e6db24d8b10a65d72d94f08845d6eb47021bbf003f9718de"} Dec 04 11:02:05 crc kubenswrapper[4831]: I1204 11:02:05.651895 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"c5605310-5110-4a33-84d3-56518bd49d56","Type":"ContainerStarted","Data":"90b9d3848d1a48464d6d4227f05b1cf1c0d463d6e54355cb50a7cf52bb6366fc"} Dec 04 11:02:05 crc kubenswrapper[4831]: I1204 11:02:05.666181 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"610ff7e3-10d8-460e-a7a1-2ad48221b858","Type":"ContainerStarted","Data":"1815c12d981a49f25ff09081949ebdbaeb44acb4f2a8746a0a3e529f3681941d"} Dec 04 11:02:05 crc kubenswrapper[4831]: I1204 11:02:05.688348 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=3.252550946 podStartE2EDuration="4.688324678s" podCreationTimestamp="2025-12-04 11:02:01 +0000 UTC" 
firstStartedPulling="2025-12-04 11:02:02.977117356 +0000 UTC m=+2819.926292670" lastFinishedPulling="2025-12-04 11:02:04.412891088 +0000 UTC m=+2821.362066402" observedRunningTime="2025-12-04 11:02:05.672454556 +0000 UTC m=+2822.621629870" watchObservedRunningTime="2025-12-04 11:02:05.688324678 +0000 UTC m=+2822.637500002" Dec 04 11:02:06 crc kubenswrapper[4831]: I1204 11:02:06.685362 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"610ff7e3-10d8-460e-a7a1-2ad48221b858","Type":"ContainerStarted","Data":"9911f51c6870b53e94055107e801e9b9813ee7e22af1d47c21cf4c51b0ecd6fd"} Dec 04 11:02:06 crc kubenswrapper[4831]: I1204 11:02:06.689498 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"6f75845a-f441-4d6a-a971-e24a8010d7fe","Type":"ContainerStarted","Data":"e5698963d3d0b09e9d05dbdfc0e28dcbe94e7fcee5d5f34125ff8bac45acccde"} Dec 04 11:02:06 crc kubenswrapper[4831]: I1204 11:02:06.716885 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=4.285594803 podStartE2EDuration="5.716865495s" podCreationTimestamp="2025-12-04 11:02:01 +0000 UTC" firstStartedPulling="2025-12-04 11:02:03.161476166 +0000 UTC m=+2820.110651480" lastFinishedPulling="2025-12-04 11:02:04.592746858 +0000 UTC m=+2821.541922172" observedRunningTime="2025-12-04 11:02:06.707843285 +0000 UTC m=+2823.657018619" watchObservedRunningTime="2025-12-04 11:02:06.716865495 +0000 UTC m=+2823.666040809" Dec 04 11:02:06 crc kubenswrapper[4831]: I1204 11:02:06.739853 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=4.840096451 podStartE2EDuration="5.7396476s" podCreationTimestamp="2025-12-04 11:02:01 +0000 UTC" firstStartedPulling="2025-12-04 11:02:03.515302751 +0000 UTC m=+2820.464478065" lastFinishedPulling="2025-12-04 11:02:04.4148539 +0000 UTC m=+2821.364029214" 
observedRunningTime="2025-12-04 11:02:06.737057671 +0000 UTC m=+2823.686232985" watchObservedRunningTime="2025-12-04 11:02:06.7396476 +0000 UTC m=+2823.688822914" Dec 04 11:02:07 crc kubenswrapper[4831]: I1204 11:02:07.232268 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:07 crc kubenswrapper[4831]: I1204 11:02:07.289133 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:07 crc kubenswrapper[4831]: I1204 11:02:07.475548 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.366725 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l7q2b"] Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.369720 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.383816 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7q2b"] Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.401141 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-catalog-content\") pod \"redhat-operators-l7q2b\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.401253 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twwsl\" (UniqueName: \"kubernetes.io/projected/58ffbd66-7af3-47f9-b260-5689251e8c1a-kube-api-access-twwsl\") pod \"redhat-operators-l7q2b\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " 
pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.401697 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-utilities\") pod \"redhat-operators-l7q2b\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.503634 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-utilities\") pod \"redhat-operators-l7q2b\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.503730 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-catalog-content\") pod \"redhat-operators-l7q2b\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.503814 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twwsl\" (UniqueName: \"kubernetes.io/projected/58ffbd66-7af3-47f9-b260-5689251e8c1a-kube-api-access-twwsl\") pod \"redhat-operators-l7q2b\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.504508 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-utilities\") pod \"redhat-operators-l7q2b\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 
11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.504577 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-catalog-content\") pod \"redhat-operators-l7q2b\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.528496 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twwsl\" (UniqueName: \"kubernetes.io/projected/58ffbd66-7af3-47f9-b260-5689251e8c1a-kube-api-access-twwsl\") pod \"redhat-operators-l7q2b\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:09 crc kubenswrapper[4831]: I1204 11:02:09.692334 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:10 crc kubenswrapper[4831]: I1204 11:02:10.194641 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7q2b"] Dec 04 11:02:10 crc kubenswrapper[4831]: I1204 11:02:10.742920 4831 generic.go:334] "Generic (PLEG): container finished" podID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerID="806de3e37ca754391b2a36c27491ad60916a204c2537519fb84e515df48dfa2d" exitCode=0 Dec 04 11:02:10 crc kubenswrapper[4831]: I1204 11:02:10.743018 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7q2b" event={"ID":"58ffbd66-7af3-47f9-b260-5689251e8c1a","Type":"ContainerDied","Data":"806de3e37ca754391b2a36c27491ad60916a204c2537519fb84e515df48dfa2d"} Dec 04 11:02:10 crc kubenswrapper[4831]: I1204 11:02:10.746107 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7q2b" 
event={"ID":"58ffbd66-7af3-47f9-b260-5689251e8c1a","Type":"ContainerStarted","Data":"0774a2a1110dd6a163bcdaa80bbbf56d6cef92b572389fba79ba54b46b7c1aa3"} Dec 04 11:02:10 crc kubenswrapper[4831]: I1204 11:02:10.745831 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 11:02:12 crc kubenswrapper[4831]: I1204 11:02:12.442014 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Dec 04 11:02:12 crc kubenswrapper[4831]: I1204 11:02:12.445332 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Dec 04 11:02:12 crc kubenswrapper[4831]: I1204 11:02:12.712018 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 04 11:02:12 crc kubenswrapper[4831]: I1204 11:02:12.783082 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7q2b" event={"ID":"58ffbd66-7af3-47f9-b260-5689251e8c1a","Type":"ContainerStarted","Data":"d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c"} Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.712798 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qp2f7"] Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.716114 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.721908 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qp2f7"] Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.736701 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-utilities\") pod \"certified-operators-qp2f7\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.736757 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-catalog-content\") pod \"certified-operators-qp2f7\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.736837 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skwqm\" (UniqueName: \"kubernetes.io/projected/c5285a80-97cc-4a93-9d72-98889f1bd500-kube-api-access-skwqm\") pod \"certified-operators-qp2f7\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.838546 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-utilities\") pod \"certified-operators-qp2f7\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.838841 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-catalog-content\") pod \"certified-operators-qp2f7\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.838982 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skwqm\" (UniqueName: \"kubernetes.io/projected/c5285a80-97cc-4a93-9d72-98889f1bd500-kube-api-access-skwqm\") pod \"certified-operators-qp2f7\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.840237 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-utilities\") pod \"certified-operators-qp2f7\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.841177 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-catalog-content\") pod \"certified-operators-qp2f7\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:14 crc kubenswrapper[4831]: I1204 11:02:14.859406 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skwqm\" (UniqueName: \"kubernetes.io/projected/c5285a80-97cc-4a93-9d72-98889f1bd500-kube-api-access-skwqm\") pod \"certified-operators-qp2f7\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:15 crc kubenswrapper[4831]: I1204 11:02:15.047150 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:15 crc kubenswrapper[4831]: W1204 11:02:15.832272 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5285a80_97cc_4a93_9d72_98889f1bd500.slice/crio-4a19a1dea19bbb2afeb49e452214b193d457bf7b09311990fcd2a7148b77cf27 WatchSource:0}: Error finding container 4a19a1dea19bbb2afeb49e452214b193d457bf7b09311990fcd2a7148b77cf27: Status 404 returned error can't find the container with id 4a19a1dea19bbb2afeb49e452214b193d457bf7b09311990fcd2a7148b77cf27 Dec 04 11:02:15 crc kubenswrapper[4831]: I1204 11:02:15.845846 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qp2f7"] Dec 04 11:02:16 crc kubenswrapper[4831]: I1204 11:02:16.825693 4831 generic.go:334] "Generic (PLEG): container finished" podID="c5285a80-97cc-4a93-9d72-98889f1bd500" containerID="57a35f5b188eda5744d6e81a3cbf6048ede23711263d50dc61ebdb8d5bbd7444" exitCode=0 Dec 04 11:02:16 crc kubenswrapper[4831]: I1204 11:02:16.825826 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp2f7" event={"ID":"c5285a80-97cc-4a93-9d72-98889f1bd500","Type":"ContainerDied","Data":"57a35f5b188eda5744d6e81a3cbf6048ede23711263d50dc61ebdb8d5bbd7444"} Dec 04 11:02:16 crc kubenswrapper[4831]: I1204 11:02:16.826215 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp2f7" event={"ID":"c5285a80-97cc-4a93-9d72-98889f1bd500","Type":"ContainerStarted","Data":"4a19a1dea19bbb2afeb49e452214b193d457bf7b09311990fcd2a7148b77cf27"} Dec 04 11:02:17 crc kubenswrapper[4831]: E1204 11:02:17.266588 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58ffbd66_7af3_47f9_b260_5689251e8c1a.slice/crio-d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c.scope\": RecentStats: unable to find data in memory cache]" Dec 04 11:02:18 crc kubenswrapper[4831]: I1204 11:02:18.854285 4831 generic.go:334] "Generic (PLEG): container finished" podID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerID="d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c" exitCode=0 Dec 04 11:02:18 crc kubenswrapper[4831]: I1204 11:02:18.855768 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7q2b" event={"ID":"58ffbd66-7af3-47f9-b260-5689251e8c1a","Type":"ContainerDied","Data":"d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c"} Dec 04 11:02:18 crc kubenswrapper[4831]: I1204 11:02:18.865746 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp2f7" event={"ID":"c5285a80-97cc-4a93-9d72-98889f1bd500","Type":"ContainerStarted","Data":"0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27"} Dec 04 11:02:21 crc kubenswrapper[4831]: I1204 11:02:21.897330 4831 generic.go:334] "Generic (PLEG): container finished" podID="c5285a80-97cc-4a93-9d72-98889f1bd500" containerID="0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27" exitCode=0 Dec 04 11:02:21 crc kubenswrapper[4831]: I1204 11:02:21.897426 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp2f7" event={"ID":"c5285a80-97cc-4a93-9d72-98889f1bd500","Type":"ContainerDied","Data":"0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27"} Dec 04 11:02:21 crc kubenswrapper[4831]: I1204 11:02:21.899539 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7q2b" 
event={"ID":"58ffbd66-7af3-47f9-b260-5689251e8c1a","Type":"ContainerStarted","Data":"5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822"} Dec 04 11:02:21 crc kubenswrapper[4831]: I1204 11:02:21.943085 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l7q2b" podStartSLOduration=2.440591646 podStartE2EDuration="12.94306684s" podCreationTimestamp="2025-12-04 11:02:09 +0000 UTC" firstStartedPulling="2025-12-04 11:02:10.745368288 +0000 UTC m=+2827.694543602" lastFinishedPulling="2025-12-04 11:02:21.247843442 +0000 UTC m=+2838.197018796" observedRunningTime="2025-12-04 11:02:21.934281397 +0000 UTC m=+2838.883456721" watchObservedRunningTime="2025-12-04 11:02:21.94306684 +0000 UTC m=+2838.892242154" Dec 04 11:02:22 crc kubenswrapper[4831]: I1204 11:02:22.911045 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp2f7" event={"ID":"c5285a80-97cc-4a93-9d72-98889f1bd500","Type":"ContainerStarted","Data":"f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a"} Dec 04 11:02:22 crc kubenswrapper[4831]: I1204 11:02:22.941952 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qp2f7" podStartSLOduration=3.19006958 podStartE2EDuration="8.941927789s" podCreationTimestamp="2025-12-04 11:02:14 +0000 UTC" firstStartedPulling="2025-12-04 11:02:16.829293631 +0000 UTC m=+2833.778468945" lastFinishedPulling="2025-12-04 11:02:22.58115184 +0000 UTC m=+2839.530327154" observedRunningTime="2025-12-04 11:02:22.927001622 +0000 UTC m=+2839.876176946" watchObservedRunningTime="2025-12-04 11:02:22.941927789 +0000 UTC m=+2839.891103113" Dec 04 11:02:25 crc kubenswrapper[4831]: I1204 11:02:25.048021 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:25 crc kubenswrapper[4831]: I1204 11:02:25.048447 
4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:25 crc kubenswrapper[4831]: I1204 11:02:25.119846 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:29 crc kubenswrapper[4831]: I1204 11:02:29.693125 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:29 crc kubenswrapper[4831]: I1204 11:02:29.693651 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:30 crc kubenswrapper[4831]: I1204 11:02:30.745064 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l7q2b" podUID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerName="registry-server" probeResult="failure" output=< Dec 04 11:02:30 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 04 11:02:30 crc kubenswrapper[4831]: > Dec 04 11:02:35 crc kubenswrapper[4831]: I1204 11:02:35.106291 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:35 crc kubenswrapper[4831]: I1204 11:02:35.151098 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qp2f7"] Dec 04 11:02:36 crc kubenswrapper[4831]: I1204 11:02:36.041396 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qp2f7" podUID="c5285a80-97cc-4a93-9d72-98889f1bd500" containerName="registry-server" containerID="cri-o://f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a" gracePeriod=2 Dec 04 11:02:36 crc kubenswrapper[4831]: I1204 11:02:36.547554 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:36 crc kubenswrapper[4831]: I1204 11:02:36.716402 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-catalog-content\") pod \"c5285a80-97cc-4a93-9d72-98889f1bd500\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " Dec 04 11:02:36 crc kubenswrapper[4831]: I1204 11:02:36.716548 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skwqm\" (UniqueName: \"kubernetes.io/projected/c5285a80-97cc-4a93-9d72-98889f1bd500-kube-api-access-skwqm\") pod \"c5285a80-97cc-4a93-9d72-98889f1bd500\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " Dec 04 11:02:36 crc kubenswrapper[4831]: I1204 11:02:36.716716 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-utilities\") pod \"c5285a80-97cc-4a93-9d72-98889f1bd500\" (UID: \"c5285a80-97cc-4a93-9d72-98889f1bd500\") " Dec 04 11:02:36 crc kubenswrapper[4831]: I1204 11:02:36.717865 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-utilities" (OuterVolumeSpecName: "utilities") pod "c5285a80-97cc-4a93-9d72-98889f1bd500" (UID: "c5285a80-97cc-4a93-9d72-98889f1bd500"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:02:36 crc kubenswrapper[4831]: I1204 11:02:36.724945 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5285a80-97cc-4a93-9d72-98889f1bd500-kube-api-access-skwqm" (OuterVolumeSpecName: "kube-api-access-skwqm") pod "c5285a80-97cc-4a93-9d72-98889f1bd500" (UID: "c5285a80-97cc-4a93-9d72-98889f1bd500"). InnerVolumeSpecName "kube-api-access-skwqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:02:36 crc kubenswrapper[4831]: I1204 11:02:36.772727 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5285a80-97cc-4a93-9d72-98889f1bd500" (UID: "c5285a80-97cc-4a93-9d72-98889f1bd500"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:02:36 crc kubenswrapper[4831]: I1204 11:02:36.819698 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skwqm\" (UniqueName: \"kubernetes.io/projected/c5285a80-97cc-4a93-9d72-98889f1bd500-kube-api-access-skwqm\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:36 crc kubenswrapper[4831]: I1204 11:02:36.819746 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:36 crc kubenswrapper[4831]: I1204 11:02:36.819759 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5285a80-97cc-4a93-9d72-98889f1bd500-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.052831 4831 generic.go:334] "Generic (PLEG): container finished" podID="c5285a80-97cc-4a93-9d72-98889f1bd500" containerID="f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a" exitCode=0 Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.052888 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp2f7" event={"ID":"c5285a80-97cc-4a93-9d72-98889f1bd500","Type":"ContainerDied","Data":"f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a"} Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.052922 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-qp2f7" event={"ID":"c5285a80-97cc-4a93-9d72-98889f1bd500","Type":"ContainerDied","Data":"4a19a1dea19bbb2afeb49e452214b193d457bf7b09311990fcd2a7148b77cf27"} Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.052943 4831 scope.go:117] "RemoveContainer" containerID="f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a" Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.052961 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qp2f7" Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.081187 4831 scope.go:117] "RemoveContainer" containerID="0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27" Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.126533 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qp2f7"] Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.143075 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qp2f7"] Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.149521 4831 scope.go:117] "RemoveContainer" containerID="57a35f5b188eda5744d6e81a3cbf6048ede23711263d50dc61ebdb8d5bbd7444" Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.184174 4831 scope.go:117] "RemoveContainer" containerID="f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a" Dec 04 11:02:37 crc kubenswrapper[4831]: E1204 11:02:37.185616 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a\": container with ID starting with f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a not found: ID does not exist" containerID="f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a" Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 
11:02:37.185681 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a"} err="failed to get container status \"f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a\": rpc error: code = NotFound desc = could not find container \"f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a\": container with ID starting with f4089b006d64313d6f901bf99e9b351beb5c438e851d5d69b3222b67cc15fa8a not found: ID does not exist" Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.185706 4831 scope.go:117] "RemoveContainer" containerID="0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27" Dec 04 11:02:37 crc kubenswrapper[4831]: E1204 11:02:37.186023 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27\": container with ID starting with 0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27 not found: ID does not exist" containerID="0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27" Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.186047 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27"} err="failed to get container status \"0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27\": rpc error: code = NotFound desc = could not find container \"0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27\": container with ID starting with 0e69a294f48d8d50aa70e1608ada42c88a0fdc8efd05ecd472b8354ee604ca27 not found: ID does not exist" Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.186059 4831 scope.go:117] "RemoveContainer" containerID="57a35f5b188eda5744d6e81a3cbf6048ede23711263d50dc61ebdb8d5bbd7444" Dec 04 11:02:37 crc 
kubenswrapper[4831]: E1204 11:02:37.186409 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a35f5b188eda5744d6e81a3cbf6048ede23711263d50dc61ebdb8d5bbd7444\": container with ID starting with 57a35f5b188eda5744d6e81a3cbf6048ede23711263d50dc61ebdb8d5bbd7444 not found: ID does not exist" containerID="57a35f5b188eda5744d6e81a3cbf6048ede23711263d50dc61ebdb8d5bbd7444" Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.186451 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a35f5b188eda5744d6e81a3cbf6048ede23711263d50dc61ebdb8d5bbd7444"} err="failed to get container status \"57a35f5b188eda5744d6e81a3cbf6048ede23711263d50dc61ebdb8d5bbd7444\": rpc error: code = NotFound desc = could not find container \"57a35f5b188eda5744d6e81a3cbf6048ede23711263d50dc61ebdb8d5bbd7444\": container with ID starting with 57a35f5b188eda5744d6e81a3cbf6048ede23711263d50dc61ebdb8d5bbd7444 not found: ID does not exist" Dec 04 11:02:37 crc kubenswrapper[4831]: I1204 11:02:37.293524 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5285a80-97cc-4a93-9d72-98889f1bd500" path="/var/lib/kubelet/pods/c5285a80-97cc-4a93-9d72-98889f1bd500/volumes" Dec 04 11:02:39 crc kubenswrapper[4831]: I1204 11:02:39.744474 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:39 crc kubenswrapper[4831]: I1204 11:02:39.796846 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:40 crc kubenswrapper[4831]: I1204 11:02:40.745928 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7q2b"] Dec 04 11:02:41 crc kubenswrapper[4831]: I1204 11:02:41.094738 4831 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-l7q2b" podUID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerName="registry-server" containerID="cri-o://5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822" gracePeriod=2 Dec 04 11:02:41 crc kubenswrapper[4831]: I1204 11:02:41.548710 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:41 crc kubenswrapper[4831]: I1204 11:02:41.622496 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twwsl\" (UniqueName: \"kubernetes.io/projected/58ffbd66-7af3-47f9-b260-5689251e8c1a-kube-api-access-twwsl\") pod \"58ffbd66-7af3-47f9-b260-5689251e8c1a\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " Dec 04 11:02:41 crc kubenswrapper[4831]: I1204 11:02:41.622643 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-catalog-content\") pod \"58ffbd66-7af3-47f9-b260-5689251e8c1a\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " Dec 04 11:02:41 crc kubenswrapper[4831]: I1204 11:02:41.622738 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-utilities\") pod \"58ffbd66-7af3-47f9-b260-5689251e8c1a\" (UID: \"58ffbd66-7af3-47f9-b260-5689251e8c1a\") " Dec 04 11:02:41 crc kubenswrapper[4831]: I1204 11:02:41.623701 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-utilities" (OuterVolumeSpecName: "utilities") pod "58ffbd66-7af3-47f9-b260-5689251e8c1a" (UID: "58ffbd66-7af3-47f9-b260-5689251e8c1a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:02:41 crc kubenswrapper[4831]: I1204 11:02:41.630302 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ffbd66-7af3-47f9-b260-5689251e8c1a-kube-api-access-twwsl" (OuterVolumeSpecName: "kube-api-access-twwsl") pod "58ffbd66-7af3-47f9-b260-5689251e8c1a" (UID: "58ffbd66-7af3-47f9-b260-5689251e8c1a"). InnerVolumeSpecName "kube-api-access-twwsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:02:41 crc kubenswrapper[4831]: I1204 11:02:41.724910 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twwsl\" (UniqueName: \"kubernetes.io/projected/58ffbd66-7af3-47f9-b260-5689251e8c1a-kube-api-access-twwsl\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:41 crc kubenswrapper[4831]: I1204 11:02:41.724956 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:41 crc kubenswrapper[4831]: I1204 11:02:41.753839 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58ffbd66-7af3-47f9-b260-5689251e8c1a" (UID: "58ffbd66-7af3-47f9-b260-5689251e8c1a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:02:41 crc kubenswrapper[4831]: I1204 11:02:41.826591 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58ffbd66-7af3-47f9-b260-5689251e8c1a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.105352 4831 generic.go:334] "Generic (PLEG): container finished" podID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerID="5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822" exitCode=0 Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.105398 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7q2b" event={"ID":"58ffbd66-7af3-47f9-b260-5689251e8c1a","Type":"ContainerDied","Data":"5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822"} Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.105436 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7q2b" event={"ID":"58ffbd66-7af3-47f9-b260-5689251e8c1a","Type":"ContainerDied","Data":"0774a2a1110dd6a163bcdaa80bbbf56d6cef92b572389fba79ba54b46b7c1aa3"} Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.105457 4831 scope.go:117] "RemoveContainer" containerID="5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822" Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.105559 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l7q2b" Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.142651 4831 scope.go:117] "RemoveContainer" containerID="d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c" Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.155725 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7q2b"] Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.167826 4831 scope.go:117] "RemoveContainer" containerID="806de3e37ca754391b2a36c27491ad60916a204c2537519fb84e515df48dfa2d" Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.172928 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l7q2b"] Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.214654 4831 scope.go:117] "RemoveContainer" containerID="5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822" Dec 04 11:02:42 crc kubenswrapper[4831]: E1204 11:02:42.215305 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822\": container with ID starting with 5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822 not found: ID does not exist" containerID="5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822" Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.215357 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822"} err="failed to get container status \"5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822\": rpc error: code = NotFound desc = could not find container \"5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822\": container with ID starting with 5285a5bccc192cae891df31600e60d7d49863943711f490442d78dca3596c822 not found: ID does 
not exist" Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.215388 4831 scope.go:117] "RemoveContainer" containerID="d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c" Dec 04 11:02:42 crc kubenswrapper[4831]: E1204 11:02:42.215784 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c\": container with ID starting with d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c not found: ID does not exist" containerID="d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c" Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.215837 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c"} err="failed to get container status \"d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c\": rpc error: code = NotFound desc = could not find container \"d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c\": container with ID starting with d7724de71e96789da1f147804ddf9d7424ab19ce1bee88b1816b2880b04d344c not found: ID does not exist" Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.215866 4831 scope.go:117] "RemoveContainer" containerID="806de3e37ca754391b2a36c27491ad60916a204c2537519fb84e515df48dfa2d" Dec 04 11:02:42 crc kubenswrapper[4831]: E1204 11:02:42.216119 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"806de3e37ca754391b2a36c27491ad60916a204c2537519fb84e515df48dfa2d\": container with ID starting with 806de3e37ca754391b2a36c27491ad60916a204c2537519fb84e515df48dfa2d not found: ID does not exist" containerID="806de3e37ca754391b2a36c27491ad60916a204c2537519fb84e515df48dfa2d" Dec 04 11:02:42 crc kubenswrapper[4831]: I1204 11:02:42.216165 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806de3e37ca754391b2a36c27491ad60916a204c2537519fb84e515df48dfa2d"} err="failed to get container status \"806de3e37ca754391b2a36c27491ad60916a204c2537519fb84e515df48dfa2d\": rpc error: code = NotFound desc = could not find container \"806de3e37ca754391b2a36c27491ad60916a204c2537519fb84e515df48dfa2d\": container with ID starting with 806de3e37ca754391b2a36c27491ad60916a204c2537519fb84e515df48dfa2d not found: ID does not exist" Dec 04 11:02:43 crc kubenswrapper[4831]: I1204 11:02:43.288858 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58ffbd66-7af3-47f9-b260-5689251e8c1a" path="/var/lib/kubelet/pods/58ffbd66-7af3-47f9-b260-5689251e8c1a/volumes" Dec 04 11:02:51 crc kubenswrapper[4831]: I1204 11:02:51.975353 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:02:51 crc kubenswrapper[4831]: I1204 11:02:51.976025 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.029857 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.030618 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="prometheus" containerID="cri-o://da005076936568b2afdf2c6e7de2b3a8ed1bae2bb63a31f2b72e623077246a19" 
gracePeriod=600 Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.030791 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="thanos-sidecar" containerID="cri-o://0558a93511fc21a680303c1455178512d22324550297919104375cbc5ea001a6" gracePeriod=600 Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.030791 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="config-reloader" containerID="cri-o://3e76e66e07641bf5c8832f9a8bc91dc4b61d75d659d278bd05fa0d513a7eb75a" gracePeriod=600 Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.270506 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r2xx8"] Dec 04 11:03:06 crc kubenswrapper[4831]: E1204 11:03:06.271205 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerName="registry-server" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.271222 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerName="registry-server" Dec 04 11:03:06 crc kubenswrapper[4831]: E1204 11:03:06.271243 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5285a80-97cc-4a93-9d72-98889f1bd500" containerName="extract-utilities" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.271249 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5285a80-97cc-4a93-9d72-98889f1bd500" containerName="extract-utilities" Dec 04 11:03:06 crc kubenswrapper[4831]: E1204 11:03:06.271261 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5285a80-97cc-4a93-9d72-98889f1bd500" containerName="extract-content" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.271268 4831 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c5285a80-97cc-4a93-9d72-98889f1bd500" containerName="extract-content" Dec 04 11:03:06 crc kubenswrapper[4831]: E1204 11:03:06.271286 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerName="extract-content" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.271291 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerName="extract-content" Dec 04 11:03:06 crc kubenswrapper[4831]: E1204 11:03:06.271322 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5285a80-97cc-4a93-9d72-98889f1bd500" containerName="registry-server" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.271327 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5285a80-97cc-4a93-9d72-98889f1bd500" containerName="registry-server" Dec 04 11:03:06 crc kubenswrapper[4831]: E1204 11:03:06.271335 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerName="extract-utilities" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.271343 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerName="extract-utilities" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.271527 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="58ffbd66-7af3-47f9-b260-5689251e8c1a" containerName="registry-server" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.271543 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5285a80-97cc-4a93-9d72-98889f1bd500" containerName="registry-server" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.326844 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.337327 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r2xx8"] Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.378220 4831 generic.go:334] "Generic (PLEG): container finished" podID="44484889-57d0-478f-a19a-2e78f913faf3" containerID="0558a93511fc21a680303c1455178512d22324550297919104375cbc5ea001a6" exitCode=0 Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.378502 4831 generic.go:334] "Generic (PLEG): container finished" podID="44484889-57d0-478f-a19a-2e78f913faf3" containerID="da005076936568b2afdf2c6e7de2b3a8ed1bae2bb63a31f2b72e623077246a19" exitCode=0 Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.378628 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44484889-57d0-478f-a19a-2e78f913faf3","Type":"ContainerDied","Data":"0558a93511fc21a680303c1455178512d22324550297919104375cbc5ea001a6"} Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.378887 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44484889-57d0-478f-a19a-2e78f913faf3","Type":"ContainerDied","Data":"da005076936568b2afdf2c6e7de2b3a8ed1bae2bb63a31f2b72e623077246a19"} Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.450371 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55wht\" (UniqueName: \"kubernetes.io/projected/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-kube-api-access-55wht\") pod \"community-operators-r2xx8\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.450546 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-utilities\") pod \"community-operators-r2xx8\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.450635 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-catalog-content\") pod \"community-operators-r2xx8\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.552017 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55wht\" (UniqueName: \"kubernetes.io/projected/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-kube-api-access-55wht\") pod \"community-operators-r2xx8\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.552101 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-utilities\") pod \"community-operators-r2xx8\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.552131 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-catalog-content\") pod \"community-operators-r2xx8\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.552792 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-catalog-content\") pod \"community-operators-r2xx8\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.552857 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-utilities\") pod \"community-operators-r2xx8\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.576144 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55wht\" (UniqueName: \"kubernetes.io/projected/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-kube-api-access-55wht\") pod \"community-operators-r2xx8\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:06 crc kubenswrapper[4831]: I1204 11:03:06.651281 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.152650 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.131:9090/-/ready\": dial tcp 10.217.0.131:9090: connect: connection refused" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.205585 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r2xx8"] Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.391451 4831 generic.go:334] "Generic (PLEG): container finished" podID="44484889-57d0-478f-a19a-2e78f913faf3" containerID="3e76e66e07641bf5c8832f9a8bc91dc4b61d75d659d278bd05fa0d513a7eb75a" exitCode=0 Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.391517 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44484889-57d0-478f-a19a-2e78f913faf3","Type":"ContainerDied","Data":"3e76e66e07641bf5c8832f9a8bc91dc4b61d75d659d278bd05fa0d513a7eb75a"} Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.391542 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44484889-57d0-478f-a19a-2e78f913faf3","Type":"ContainerDied","Data":"513900fbf669dc79ceb241084fd1b200a072a12239ce24a633bbaed1680cd8ff"} Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.391551 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="513900fbf669dc79ceb241084fd1b200a072a12239ce24a633bbaed1680cd8ff" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.394963 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2xx8" 
event={"ID":"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9","Type":"ContainerStarted","Data":"85c1a8c601d5ae409a27bd94a0575a782b632c022cc1ad25ac1b97fab740755f"} Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.480129 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.580591 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config\") pod \"44484889-57d0-478f-a19a-2e78f913faf3\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.580982 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/44484889-57d0-478f-a19a-2e78f913faf3-config-out\") pod \"44484889-57d0-478f-a19a-2e78f913faf3\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.581024 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-secret-combined-ca-bundle\") pod \"44484889-57d0-478f-a19a-2e78f913faf3\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.581061 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-config\") pod \"44484889-57d0-478f-a19a-2e78f913faf3\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.581397 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll6xr\" (UniqueName: 
\"kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-kube-api-access-ll6xr\") pod \"44484889-57d0-478f-a19a-2e78f913faf3\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.581421 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/44484889-57d0-478f-a19a-2e78f913faf3-prometheus-metric-storage-rulefiles-0\") pod \"44484889-57d0-478f-a19a-2e78f913faf3\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.581695 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"44484889-57d0-478f-a19a-2e78f913faf3\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.581746 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-thanos-prometheus-http-client-file\") pod \"44484889-57d0-478f-a19a-2e78f913faf3\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.581825 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-tls-assets\") pod \"44484889-57d0-478f-a19a-2e78f913faf3\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.581874 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"44484889-57d0-478f-a19a-2e78f913faf3\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.581908 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"44484889-57d0-478f-a19a-2e78f913faf3\" (UID: \"44484889-57d0-478f-a19a-2e78f913faf3\") " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.582534 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44484889-57d0-478f-a19a-2e78f913faf3-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "44484889-57d0-478f-a19a-2e78f913faf3" (UID: "44484889-57d0-478f-a19a-2e78f913faf3"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.590774 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "44484889-57d0-478f-a19a-2e78f913faf3" (UID: "44484889-57d0-478f-a19a-2e78f913faf3"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.591297 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44484889-57d0-478f-a19a-2e78f913faf3-config-out" (OuterVolumeSpecName: "config-out") pod "44484889-57d0-478f-a19a-2e78f913faf3" (UID: "44484889-57d0-478f-a19a-2e78f913faf3"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.591318 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "44484889-57d0-478f-a19a-2e78f913faf3" (UID: "44484889-57d0-478f-a19a-2e78f913faf3"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.593296 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "44484889-57d0-478f-a19a-2e78f913faf3" (UID: "44484889-57d0-478f-a19a-2e78f913faf3"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.593331 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "44484889-57d0-478f-a19a-2e78f913faf3" (UID: "44484889-57d0-478f-a19a-2e78f913faf3"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.594822 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "44484889-57d0-478f-a19a-2e78f913faf3" (UID: "44484889-57d0-478f-a19a-2e78f913faf3"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.595980 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-config" (OuterVolumeSpecName: "config") pod "44484889-57d0-478f-a19a-2e78f913faf3" (UID: "44484889-57d0-478f-a19a-2e78f913faf3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.596555 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-kube-api-access-ll6xr" (OuterVolumeSpecName: "kube-api-access-ll6xr") pod "44484889-57d0-478f-a19a-2e78f913faf3" (UID: "44484889-57d0-478f-a19a-2e78f913faf3"). InnerVolumeSpecName "kube-api-access-ll6xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.621605 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "44484889-57d0-478f-a19a-2e78f913faf3" (UID: "44484889-57d0-478f-a19a-2e78f913faf3"). InnerVolumeSpecName "pvc-7b44ca90-3490-4c8d-99fe-c1474d342303". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.684176 4831 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.684215 4831 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.684228 4831 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.684238 4831 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/44484889-57d0-478f-a19a-2e78f913faf3-config-out\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.684252 4831 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.684260 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-config\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.684268 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll6xr\" (UniqueName: 
\"kubernetes.io/projected/44484889-57d0-478f-a19a-2e78f913faf3-kube-api-access-ll6xr\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.684276 4831 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/44484889-57d0-478f-a19a-2e78f913faf3-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.684307 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") on node \"crc\" " Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.684320 4831 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.702735 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config" (OuterVolumeSpecName: "web-config") pod "44484889-57d0-478f-a19a-2e78f913faf3" (UID: "44484889-57d0-478f-a19a-2e78f913faf3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.714355 4831 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.715115 4831 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7b44ca90-3490-4c8d-99fe-c1474d342303" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303") on node "crc" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.786193 4831 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/44484889-57d0-478f-a19a-2e78f913faf3-web-config\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:07 crc kubenswrapper[4831]: I1204 11:03:07.786240 4831 reconciler_common.go:293] "Volume detached for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.432912 4831 generic.go:334] "Generic (PLEG): container finished" podID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" containerID="0cfdf154b47ed740177aeb89151f29ba49f929bc44801f4b8273fe75edd18a69" exitCode=0 Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.433229 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.437950 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2xx8" event={"ID":"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9","Type":"ContainerDied","Data":"0cfdf154b47ed740177aeb89151f29ba49f929bc44801f4b8273fe75edd18a69"} Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.507101 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.515864 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.539531 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 11:03:08 crc kubenswrapper[4831]: E1204 11:03:08.540049 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="thanos-sidecar" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.540069 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="thanos-sidecar" Dec 04 11:03:08 crc kubenswrapper[4831]: E1204 11:03:08.540093 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="init-config-reloader" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.540101 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="init-config-reloader" Dec 04 11:03:08 crc kubenswrapper[4831]: E1204 11:03:08.540121 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="config-reloader" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.540128 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="config-reloader" Dec 04 11:03:08 crc kubenswrapper[4831]: E1204 11:03:08.540142 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="prometheus" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.540147 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="prometheus" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.540328 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="prometheus" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.540346 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="thanos-sidecar" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.540365 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="44484889-57d0-478f-a19a-2e78f913faf3" containerName="config-reloader" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.542519 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.546630 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.546898 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.547018 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-6p9wz" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.548021 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.548150 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.552809 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.562843 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.622127 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/625718e4-29fd-4886-ae65-76091a6def3c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.622174 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-config\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.622217 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8c4x\" (UniqueName: \"kubernetes.io/projected/625718e4-29fd-4886-ae65-76091a6def3c-kube-api-access-f8c4x\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.622281 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.622356 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.622387 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " 
pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.622460 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.622507 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.622544 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/625718e4-29fd-4886-ae65-76091a6def3c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.622571 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.622588 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/625718e4-29fd-4886-ae65-76091a6def3c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: E1204 11:03:08.712135 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44484889_57d0_478f_a19a_2e78f913faf3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44484889_57d0_478f_a19a_2e78f913faf3.slice/crio-513900fbf669dc79ceb241084fd1b200a072a12239ce24a633bbaed1680cd8ff\": RecentStats: unable to find data in memory cache]" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.733920 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.734008 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/625718e4-29fd-4886-ae65-76091a6def3c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.734053 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 
11:03:08.734082 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/625718e4-29fd-4886-ae65-76091a6def3c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.734234 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/625718e4-29fd-4886-ae65-76091a6def3c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.734298 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-config\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.734366 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8c4x\" (UniqueName: \"kubernetes.io/projected/625718e4-29fd-4886-ae65-76091a6def3c-kube-api-access-f8c4x\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.734394 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " 
pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.734438 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.734460 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.734513 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.736352 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/625718e4-29fd-4886-ae65-76091a6def3c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.749060 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.760969 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.761881 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/625718e4-29fd-4886-ae65-76091a6def3c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.767134 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.768699 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/625718e4-29fd-4886-ae65-76091a6def3c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.771393 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.771448 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c51bcc27fcdd05e3b872f332894f06756ce616ce82fb1b91e1091af3050124aa/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.772415 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.777373 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.784886 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8c4x\" (UniqueName: \"kubernetes.io/projected/625718e4-29fd-4886-ae65-76091a6def3c-kube-api-access-f8c4x\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 
11:03:08.807507 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/625718e4-29fd-4886-ae65-76091a6def3c-config\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.822943 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b44ca90-3490-4c8d-99fe-c1474d342303\") pod \"prometheus-metric-storage-0\" (UID: \"625718e4-29fd-4886-ae65-76091a6def3c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:08 crc kubenswrapper[4831]: I1204 11:03:08.870147 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:09 crc kubenswrapper[4831]: I1204 11:03:09.289414 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44484889-57d0-478f-a19a-2e78f913faf3" path="/var/lib/kubelet/pods/44484889-57d0-478f-a19a-2e78f913faf3/volumes" Dec 04 11:03:09 crc kubenswrapper[4831]: I1204 11:03:09.330153 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 11:03:09 crc kubenswrapper[4831]: I1204 11:03:09.452752 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"625718e4-29fd-4886-ae65-76091a6def3c","Type":"ContainerStarted","Data":"790201073eb94913de410295d5f314474d24dfad4170144b68f486526c09e8e0"} Dec 04 11:03:10 crc kubenswrapper[4831]: I1204 11:03:10.463165 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2xx8" event={"ID":"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9","Type":"ContainerStarted","Data":"37e9bab6467aa1082e675dc0dc982c034e7e438858d50537d93e3615d8802e68"} Dec 04 11:03:11 crc 
kubenswrapper[4831]: I1204 11:03:11.474119 4831 generic.go:334] "Generic (PLEG): container finished" podID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" containerID="37e9bab6467aa1082e675dc0dc982c034e7e438858d50537d93e3615d8802e68" exitCode=0 Dec 04 11:03:11 crc kubenswrapper[4831]: I1204 11:03:11.474178 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2xx8" event={"ID":"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9","Type":"ContainerDied","Data":"37e9bab6467aa1082e675dc0dc982c034e7e438858d50537d93e3615d8802e68"} Dec 04 11:03:13 crc kubenswrapper[4831]: I1204 11:03:13.492626 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2xx8" event={"ID":"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9","Type":"ContainerStarted","Data":"de6431545ae85e94d29d08a407ab9c88cfbf63e9241a893dfc1a57b351d629cf"} Dec 04 11:03:13 crc kubenswrapper[4831]: I1204 11:03:13.494687 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"625718e4-29fd-4886-ae65-76091a6def3c","Type":"ContainerStarted","Data":"d7a142e5e3ee4e7a82b95d8419f2a7f1a5d3c4c149a01b0bc087d9d0606d4c32"} Dec 04 11:03:13 crc kubenswrapper[4831]: I1204 11:03:13.517323 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r2xx8" podStartSLOduration=3.867007144 podStartE2EDuration="7.517305805s" podCreationTimestamp="2025-12-04 11:03:06 +0000 UTC" firstStartedPulling="2025-12-04 11:03:08.443990522 +0000 UTC m=+2885.393165836" lastFinishedPulling="2025-12-04 11:03:12.094289193 +0000 UTC m=+2889.043464497" observedRunningTime="2025-12-04 11:03:13.510901095 +0000 UTC m=+2890.460076409" watchObservedRunningTime="2025-12-04 11:03:13.517305805 +0000 UTC m=+2890.466481119" Dec 04 11:03:16 crc kubenswrapper[4831]: I1204 11:03:16.653185 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:16 crc kubenswrapper[4831]: I1204 11:03:16.653875 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:16 crc kubenswrapper[4831]: I1204 11:03:16.703930 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:17 crc kubenswrapper[4831]: I1204 11:03:17.585735 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:17 crc kubenswrapper[4831]: I1204 11:03:17.670398 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r2xx8"] Dec 04 11:03:19 crc kubenswrapper[4831]: I1204 11:03:19.561185 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r2xx8" podUID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" containerName="registry-server" containerID="cri-o://de6431545ae85e94d29d08a407ab9c88cfbf63e9241a893dfc1a57b351d629cf" gracePeriod=2 Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.576421 4831 generic.go:334] "Generic (PLEG): container finished" podID="625718e4-29fd-4886-ae65-76091a6def3c" containerID="d7a142e5e3ee4e7a82b95d8419f2a7f1a5d3c4c149a01b0bc087d9d0606d4c32" exitCode=0 Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.576504 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"625718e4-29fd-4886-ae65-76091a6def3c","Type":"ContainerDied","Data":"d7a142e5e3ee4e7a82b95d8419f2a7f1a5d3c4c149a01b0bc087d9d0606d4c32"} Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.583147 4831 generic.go:334] "Generic (PLEG): container finished" podID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" containerID="de6431545ae85e94d29d08a407ab9c88cfbf63e9241a893dfc1a57b351d629cf" exitCode=0 Dec 04 
11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.583191 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2xx8" event={"ID":"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9","Type":"ContainerDied","Data":"de6431545ae85e94d29d08a407ab9c88cfbf63e9241a893dfc1a57b351d629cf"} Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.583220 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2xx8" event={"ID":"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9","Type":"ContainerDied","Data":"85c1a8c601d5ae409a27bd94a0575a782b632c022cc1ad25ac1b97fab740755f"} Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.583231 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85c1a8c601d5ae409a27bd94a0575a782b632c022cc1ad25ac1b97fab740755f" Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.710629 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.785810 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-utilities\") pod \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.785989 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55wht\" (UniqueName: \"kubernetes.io/projected/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-kube-api-access-55wht\") pod \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.786023 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-catalog-content\") pod \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\" (UID: \"1a5f794f-9b0c-4c82-88ac-e0626e5c79f9\") " Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.786692 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-utilities" (OuterVolumeSpecName: "utilities") pod "1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" (UID: "1a5f794f-9b0c-4c82-88ac-e0626e5c79f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.786796 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.794005 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-kube-api-access-55wht" (OuterVolumeSpecName: "kube-api-access-55wht") pod "1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" (UID: "1a5f794f-9b0c-4c82-88ac-e0626e5c79f9"). InnerVolumeSpecName "kube-api-access-55wht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:03:20 crc kubenswrapper[4831]: I1204 11:03:20.888655 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55wht\" (UniqueName: \"kubernetes.io/projected/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-kube-api-access-55wht\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:21 crc kubenswrapper[4831]: I1204 11:03:21.595997 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r2xx8" Dec 04 11:03:21 crc kubenswrapper[4831]: I1204 11:03:21.941752 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" (UID: "1a5f794f-9b0c-4c82-88ac-e0626e5c79f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:03:21 crc kubenswrapper[4831]: I1204 11:03:21.977620 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:03:21 crc kubenswrapper[4831]: I1204 11:03:21.977705 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:03:22 crc kubenswrapper[4831]: I1204 11:03:22.026388 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:22 crc kubenswrapper[4831]: I1204 11:03:22.256506 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r2xx8"] Dec 04 11:03:22 crc kubenswrapper[4831]: I1204 11:03:22.270968 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r2xx8"] Dec 04 11:03:22 crc kubenswrapper[4831]: I1204 11:03:22.609564 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"625718e4-29fd-4886-ae65-76091a6def3c","Type":"ContainerStarted","Data":"41c13b60e762892852548cd95cdad76991ee3dff761ccfd807234b7672e0250f"} Dec 04 11:03:23 crc kubenswrapper[4831]: I1204 11:03:23.291874 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" path="/var/lib/kubelet/pods/1a5f794f-9b0c-4c82-88ac-e0626e5c79f9/volumes" Dec 04 11:03:26 crc kubenswrapper[4831]: I1204 11:03:26.537916 4831 scope.go:117] "RemoveContainer" containerID="da005076936568b2afdf2c6e7de2b3a8ed1bae2bb63a31f2b72e623077246a19" Dec 04 11:03:26 crc kubenswrapper[4831]: I1204 11:03:26.558868 4831 scope.go:117] "RemoveContainer" containerID="0558a93511fc21a680303c1455178512d22324550297919104375cbc5ea001a6" Dec 04 11:03:26 crc kubenswrapper[4831]: I1204 11:03:26.580697 4831 scope.go:117] "RemoveContainer" containerID="3e76e66e07641bf5c8832f9a8bc91dc4b61d75d659d278bd05fa0d513a7eb75a" Dec 04 11:03:26 crc kubenswrapper[4831]: I1204 11:03:26.602498 4831 scope.go:117] "RemoveContainer" containerID="29d94e9ef7d95da5ed21b153b30bb46ff0865fe591891f5e9d44bf1656bd0e05" Dec 04 11:03:26 crc kubenswrapper[4831]: I1204 11:03:26.932287 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"625718e4-29fd-4886-ae65-76091a6def3c","Type":"ContainerStarted","Data":"1f66306d15f79f0d10bc00409cd0dcb44b1e8b070b3398e0238f9ed2e78e28be"} Dec 04 11:03:26 crc kubenswrapper[4831]: I1204 11:03:26.932592 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"625718e4-29fd-4886-ae65-76091a6def3c","Type":"ContainerStarted","Data":"c894c6309c6fbe2c6d16e0939217768dbeab5c63e6b863a6b2476d1f20dda139"} Dec 04 11:03:26 crc kubenswrapper[4831]: I1204 11:03:26.960003 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" 
podStartSLOduration=18.959984566 podStartE2EDuration="18.959984566s" podCreationTimestamp="2025-12-04 11:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:03:26.958303771 +0000 UTC m=+2903.907479095" watchObservedRunningTime="2025-12-04 11:03:26.959984566 +0000 UTC m=+2903.909159880" Dec 04 11:03:28 crc kubenswrapper[4831]: I1204 11:03:28.871291 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:38 crc kubenswrapper[4831]: I1204 11:03:38.871420 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:38 crc kubenswrapper[4831]: I1204 11:03:38.877371 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:39 crc kubenswrapper[4831]: I1204 11:03:39.040453 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 04 11:03:50 crc kubenswrapper[4831]: E1204 11:03:50.108963 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.146:57814->38.102.83.146:38009: write tcp 38.102.83.146:57814->38.102.83.146:38009: write: broken pipe Dec 04 11:03:51 crc kubenswrapper[4831]: I1204 11:03:51.972143 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:03:51 crc kubenswrapper[4831]: I1204 11:03:51.972729 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:03:51 crc kubenswrapper[4831]: I1204 11:03:51.972830 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 11:03:51 crc kubenswrapper[4831]: I1204 11:03:51.974141 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f216158f3258fd1dae0c50bdc0d7bdad944b732feabe96a8d81c113e0657e369"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:03:51 crc kubenswrapper[4831]: I1204 11:03:51.974265 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://f216158f3258fd1dae0c50bdc0d7bdad944b732feabe96a8d81c113e0657e369" gracePeriod=600 Dec 04 11:03:53 crc kubenswrapper[4831]: I1204 11:03:53.175001 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="f216158f3258fd1dae0c50bdc0d7bdad944b732feabe96a8d81c113e0657e369" exitCode=0 Dec 04 11:03:53 crc kubenswrapper[4831]: I1204 11:03:53.175308 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"f216158f3258fd1dae0c50bdc0d7bdad944b732feabe96a8d81c113e0657e369"} Dec 04 11:03:53 crc kubenswrapper[4831]: I1204 11:03:53.175334 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" 
event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e"} Dec 04 11:03:53 crc kubenswrapper[4831]: I1204 11:03:53.175352 4831 scope.go:117] "RemoveContainer" containerID="7680c56ce4695af5a3b6bef889c0877624c8a70c4075962710bf59431dcdb2e9" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.640746 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 11:04:02 crc kubenswrapper[4831]: E1204 11:04:02.641537 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" containerName="extract-utilities" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.641551 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" containerName="extract-utilities" Dec 04 11:04:02 crc kubenswrapper[4831]: E1204 11:04:02.641569 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" containerName="extract-content" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.641576 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" containerName="extract-content" Dec 04 11:04:02 crc kubenswrapper[4831]: E1204 11:04:02.641610 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" containerName="registry-server" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.641617 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" containerName="registry-server" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.642185 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5f794f-9b0c-4c82-88ac-e0626e5c79f9" containerName="registry-server" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.642913 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.646473 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.646514 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9bj9r" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.646710 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.646766 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.677875 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.735495 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.735576 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.735645 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxkw7\" (UniqueName: 
\"kubernetes.io/projected/b3441a94-3bf3-4956-8d5b-0b88f451404b-kube-api-access-cxkw7\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.735739 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.735815 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.735849 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.736055 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-config-data\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.736160 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.736202 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.838566 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.838975 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.839003 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-config-data\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.839030 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.839053 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.839095 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.839105 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.839184 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.839238 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxkw7\" (UniqueName: 
\"kubernetes.io/projected/b3441a94-3bf3-4956-8d5b-0b88f451404b-kube-api-access-cxkw7\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.839306 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.839626 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.839988 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.840253 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.841246 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-config-data\") pod 
\"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.845452 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.852216 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.852543 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.862502 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxkw7\" (UniqueName: \"kubernetes.io/projected/b3441a94-3bf3-4956-8d5b-0b88f451404b-kube-api-access-cxkw7\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.870991 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " pod="openstack/tempest-tests-tempest" Dec 04 11:04:02 crc kubenswrapper[4831]: I1204 11:04:02.988253 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 11:04:03 crc kubenswrapper[4831]: I1204 11:04:03.483074 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 11:04:04 crc kubenswrapper[4831]: I1204 11:04:04.360837 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b3441a94-3bf3-4956-8d5b-0b88f451404b","Type":"ContainerStarted","Data":"929459f7aab8ed6e5e675e06ef649edf6b9472533590b0cbf9f6e0bbec2c230e"} Dec 04 11:04:18 crc kubenswrapper[4831]: I1204 11:04:18.590116 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 04 11:04:20 crc kubenswrapper[4831]: I1204 11:04:20.550052 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b3441a94-3bf3-4956-8d5b-0b88f451404b","Type":"ContainerStarted","Data":"e62e569f51abaa1285cdede070c8e6bd68d1d47f60fa8a97297ccad898a14d98"} Dec 04 11:04:20 crc kubenswrapper[4831]: I1204 11:04:20.571358 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.469698041 podStartE2EDuration="19.571339844s" podCreationTimestamp="2025-12-04 11:04:01 +0000 UTC" firstStartedPulling="2025-12-04 11:04:03.485696998 +0000 UTC m=+2940.434872312" lastFinishedPulling="2025-12-04 11:04:18.587338801 +0000 UTC m=+2955.536514115" observedRunningTime="2025-12-04 11:04:20.570245345 +0000 UTC m=+2957.519420669" watchObservedRunningTime="2025-12-04 11:04:20.571339844 +0000 UTC m=+2957.520515158" Dec 04 11:06:21 crc kubenswrapper[4831]: I1204 11:06:21.972069 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 04 11:06:21 crc kubenswrapper[4831]: I1204 11:06:21.972558 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:06:51 crc kubenswrapper[4831]: I1204 11:06:51.971628 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:06:51 crc kubenswrapper[4831]: I1204 11:06:51.972505 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:07:21 crc kubenswrapper[4831]: I1204 11:07:21.971742 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:07:21 crc kubenswrapper[4831]: I1204 11:07:21.972488 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:07:21 crc kubenswrapper[4831]: I1204 11:07:21.972550 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 11:07:21 crc kubenswrapper[4831]: I1204 11:07:21.973570 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:07:21 crc kubenswrapper[4831]: I1204 11:07:21.973631 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" gracePeriod=600 Dec 04 11:07:22 crc kubenswrapper[4831]: E1204 11:07:22.098024 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:07:22 crc kubenswrapper[4831]: I1204 11:07:22.433562 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" exitCode=0 Dec 04 11:07:22 crc kubenswrapper[4831]: I1204 11:07:22.433610 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" 
event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e"} Dec 04 11:07:22 crc kubenswrapper[4831]: I1204 11:07:22.433645 4831 scope.go:117] "RemoveContainer" containerID="f216158f3258fd1dae0c50bdc0d7bdad944b732feabe96a8d81c113e0657e369" Dec 04 11:07:22 crc kubenswrapper[4831]: I1204 11:07:22.434061 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:07:22 crc kubenswrapper[4831]: E1204 11:07:22.434370 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:07:35 crc kubenswrapper[4831]: I1204 11:07:35.280511 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:07:35 crc kubenswrapper[4831]: E1204 11:07:35.281386 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:07:49 crc kubenswrapper[4831]: I1204 11:07:49.277135 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:07:49 crc kubenswrapper[4831]: E1204 11:07:49.277901 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:08:03 crc kubenswrapper[4831]: I1204 11:08:03.286316 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:08:03 crc kubenswrapper[4831]: E1204 11:08:03.287205 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:08:15 crc kubenswrapper[4831]: I1204 11:08:15.277866 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:08:15 crc kubenswrapper[4831]: E1204 11:08:15.279200 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:08:28 crc kubenswrapper[4831]: I1204 11:08:28.276459 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:08:28 crc kubenswrapper[4831]: E1204 11:08:28.277256 4831 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:08:42 crc kubenswrapper[4831]: I1204 11:08:42.276862 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:08:42 crc kubenswrapper[4831]: E1204 11:08:42.277532 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:08:56 crc kubenswrapper[4831]: I1204 11:08:56.276258 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:08:56 crc kubenswrapper[4831]: E1204 11:08:56.277963 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:09:10 crc kubenswrapper[4831]: I1204 11:09:10.276773 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:09:10 crc kubenswrapper[4831]: E1204 11:09:10.277479 4831 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:09:21 crc kubenswrapper[4831]: I1204 11:09:21.276897 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:09:21 crc kubenswrapper[4831]: E1204 11:09:21.277777 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:09:26 crc kubenswrapper[4831]: I1204 11:09:26.809610 4831 scope.go:117] "RemoveContainer" containerID="de6431545ae85e94d29d08a407ab9c88cfbf63e9241a893dfc1a57b351d629cf" Dec 04 11:09:26 crc kubenswrapper[4831]: I1204 11:09:26.845497 4831 scope.go:117] "RemoveContainer" containerID="37e9bab6467aa1082e675dc0dc982c034e7e438858d50537d93e3615d8802e68" Dec 04 11:09:26 crc kubenswrapper[4831]: I1204 11:09:26.873359 4831 scope.go:117] "RemoveContainer" containerID="0cfdf154b47ed740177aeb89151f29ba49f929bc44801f4b8273fe75edd18a69" Dec 04 11:09:35 crc kubenswrapper[4831]: I1204 11:09:35.280281 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:09:35 crc kubenswrapper[4831]: E1204 11:09:35.281376 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:09:49 crc kubenswrapper[4831]: I1204 11:09:49.276280 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:09:49 crc kubenswrapper[4831]: E1204 11:09:49.277303 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:09:58 crc kubenswrapper[4831]: I1204 11:09:58.343210 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-66c4c75c85-69mpg" podUID="2f2179f9-7122-438d-85cc-012b724ccae8" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 04 11:10:00 crc kubenswrapper[4831]: I1204 11:10:00.276789 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:10:00 crc kubenswrapper[4831]: E1204 11:10:00.277453 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:10:14 crc kubenswrapper[4831]: I1204 11:10:14.276891 
4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:10:14 crc kubenswrapper[4831]: E1204 11:10:14.277739 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:10:27 crc kubenswrapper[4831]: I1204 11:10:27.276856 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:10:27 crc kubenswrapper[4831]: E1204 11:10:27.277776 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:10:38 crc kubenswrapper[4831]: I1204 11:10:38.276561 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:10:38 crc kubenswrapper[4831]: E1204 11:10:38.277197 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:10:50 crc kubenswrapper[4831]: I1204 
11:10:50.276944 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:10:50 crc kubenswrapper[4831]: E1204 11:10:50.277648 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:11:03 crc kubenswrapper[4831]: I1204 11:11:03.285147 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:11:03 crc kubenswrapper[4831]: E1204 11:11:03.286114 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:11:18 crc kubenswrapper[4831]: I1204 11:11:18.276237 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:11:18 crc kubenswrapper[4831]: E1204 11:11:18.277086 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:11:32 crc 
kubenswrapper[4831]: I1204 11:11:32.276910 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:11:32 crc kubenswrapper[4831]: E1204 11:11:32.277716 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:11:45 crc kubenswrapper[4831]: I1204 11:11:45.276533 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:11:45 crc kubenswrapper[4831]: E1204 11:11:45.277302 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:11:59 crc kubenswrapper[4831]: I1204 11:11:59.276464 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:11:59 crc kubenswrapper[4831]: E1204 11:11:59.277279 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 
04 11:12:14 crc kubenswrapper[4831]: I1204 11:12:14.276094 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:12:14 crc kubenswrapper[4831]: E1204 11:12:14.277051 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:12:26 crc kubenswrapper[4831]: I1204 11:12:26.277083 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:12:27 crc kubenswrapper[4831]: I1204 11:12:27.076418 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"57e437ff5859ec9f82a8e1fe2572501527333499dc8755f22ed7b112ac0dbdce"} Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.386496 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7zzf4"] Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.389780 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.403380 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7zzf4"] Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.491443 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-catalog-content\") pod \"redhat-operators-7zzf4\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.491861 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-utilities\") pod \"redhat-operators-7zzf4\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.492002 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phrvr\" (UniqueName: \"kubernetes.io/projected/66f3bb3f-4862-4db4-8afe-edd235beb24a-kube-api-access-phrvr\") pod \"redhat-operators-7zzf4\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.594009 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-catalog-content\") pod \"redhat-operators-7zzf4\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.594130 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-utilities\") pod \"redhat-operators-7zzf4\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.594159 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phrvr\" (UniqueName: \"kubernetes.io/projected/66f3bb3f-4862-4db4-8afe-edd235beb24a-kube-api-access-phrvr\") pod \"redhat-operators-7zzf4\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.594541 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-catalog-content\") pod \"redhat-operators-7zzf4\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.594582 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-utilities\") pod \"redhat-operators-7zzf4\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.620790 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phrvr\" (UniqueName: \"kubernetes.io/projected/66f3bb3f-4862-4db4-8afe-edd235beb24a-kube-api-access-phrvr\") pod \"redhat-operators-7zzf4\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:29 crc kubenswrapper[4831]: I1204 11:12:29.716994 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:30 crc kubenswrapper[4831]: I1204 11:12:30.203056 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7zzf4"] Dec 04 11:12:31 crc kubenswrapper[4831]: I1204 11:12:31.120953 4831 generic.go:334] "Generic (PLEG): container finished" podID="66f3bb3f-4862-4db4-8afe-edd235beb24a" containerID="4f078b742d5ec2884b9952edb7f73a9f28ca2b543abfa7b4c9e1150a8c882ab7" exitCode=0 Dec 04 11:12:31 crc kubenswrapper[4831]: I1204 11:12:31.121001 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zzf4" event={"ID":"66f3bb3f-4862-4db4-8afe-edd235beb24a","Type":"ContainerDied","Data":"4f078b742d5ec2884b9952edb7f73a9f28ca2b543abfa7b4c9e1150a8c882ab7"} Dec 04 11:12:31 crc kubenswrapper[4831]: I1204 11:12:31.121026 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zzf4" event={"ID":"66f3bb3f-4862-4db4-8afe-edd235beb24a","Type":"ContainerStarted","Data":"8c22e415a53bae2bdc53192cebc4d882f0b430b222e8c53daca336e07180d4e9"} Dec 04 11:12:31 crc kubenswrapper[4831]: I1204 11:12:31.124638 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 11:12:32 crc kubenswrapper[4831]: I1204 11:12:32.133570 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zzf4" event={"ID":"66f3bb3f-4862-4db4-8afe-edd235beb24a","Type":"ContainerStarted","Data":"0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5"} Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.355124 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hdlfb"] Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.357913 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.366096 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdlfb"] Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.501004 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnzbg\" (UniqueName: \"kubernetes.io/projected/1018d840-50ec-4676-abde-d52186c7d09b-kube-api-access-mnzbg\") pod \"certified-operators-hdlfb\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.501128 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-catalog-content\") pod \"certified-operators-hdlfb\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.501216 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-utilities\") pod \"certified-operators-hdlfb\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.603358 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnzbg\" (UniqueName: \"kubernetes.io/projected/1018d840-50ec-4676-abde-d52186c7d09b-kube-api-access-mnzbg\") pod \"certified-operators-hdlfb\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.603460 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-catalog-content\") pod \"certified-operators-hdlfb\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.603503 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-utilities\") pod \"certified-operators-hdlfb\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.603989 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-catalog-content\") pod \"certified-operators-hdlfb\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.604093 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-utilities\") pod \"certified-operators-hdlfb\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.623966 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnzbg\" (UniqueName: \"kubernetes.io/projected/1018d840-50ec-4676-abde-d52186c7d09b-kube-api-access-mnzbg\") pod \"certified-operators-hdlfb\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:33 crc kubenswrapper[4831]: I1204 11:12:33.690454 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:34 crc kubenswrapper[4831]: I1204 11:12:34.376077 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdlfb"] Dec 04 11:12:35 crc kubenswrapper[4831]: I1204 11:12:35.169214 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlfb" event={"ID":"1018d840-50ec-4676-abde-d52186c7d09b","Type":"ContainerStarted","Data":"4b66d1c6991550ad7fc870a809e011d17241a5bf7a261afde1c739abb1c4744d"} Dec 04 11:12:36 crc kubenswrapper[4831]: I1204 11:12:36.180091 4831 generic.go:334] "Generic (PLEG): container finished" podID="1018d840-50ec-4676-abde-d52186c7d09b" containerID="9d6d96b23c8bc1f6bab2db9222ed68ff9f6af2f368d439da18bb90f6d1a9b69c" exitCode=0 Dec 04 11:12:36 crc kubenswrapper[4831]: I1204 11:12:36.180173 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlfb" event={"ID":"1018d840-50ec-4676-abde-d52186c7d09b","Type":"ContainerDied","Data":"9d6d96b23c8bc1f6bab2db9222ed68ff9f6af2f368d439da18bb90f6d1a9b69c"} Dec 04 11:12:36 crc kubenswrapper[4831]: I1204 11:12:36.953488 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vqbk2"] Dec 04 11:12:36 crc kubenswrapper[4831]: I1204 11:12:36.956081 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:36 crc kubenswrapper[4831]: I1204 11:12:36.967342 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqbk2"] Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.078838 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7p8t\" (UniqueName: \"kubernetes.io/projected/d5efd266-00e3-426a-9136-30920dd67f9f-kube-api-access-s7p8t\") pod \"redhat-marketplace-vqbk2\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.078901 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-catalog-content\") pod \"redhat-marketplace-vqbk2\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.078934 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-utilities\") pod \"redhat-marketplace-vqbk2\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.180920 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7p8t\" (UniqueName: \"kubernetes.io/projected/d5efd266-00e3-426a-9136-30920dd67f9f-kube-api-access-s7p8t\") pod \"redhat-marketplace-vqbk2\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.180987 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-catalog-content\") pod \"redhat-marketplace-vqbk2\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.181011 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-utilities\") pod \"redhat-marketplace-vqbk2\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.181591 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-catalog-content\") pod \"redhat-marketplace-vqbk2\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.181598 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-utilities\") pod \"redhat-marketplace-vqbk2\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.192185 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlfb" event={"ID":"1018d840-50ec-4676-abde-d52186c7d09b","Type":"ContainerStarted","Data":"ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0"} Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.194057 4831 generic.go:334] "Generic (PLEG): container finished" podID="66f3bb3f-4862-4db4-8afe-edd235beb24a" 
containerID="0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5" exitCode=0 Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.194088 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zzf4" event={"ID":"66f3bb3f-4862-4db4-8afe-edd235beb24a","Type":"ContainerDied","Data":"0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5"} Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.213943 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7p8t\" (UniqueName: \"kubernetes.io/projected/d5efd266-00e3-426a-9136-30920dd67f9f-kube-api-access-s7p8t\") pod \"redhat-marketplace-vqbk2\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.278519 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:37 crc kubenswrapper[4831]: I1204 11:12:37.802393 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqbk2"] Dec 04 11:12:38 crc kubenswrapper[4831]: I1204 11:12:38.207723 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqbk2" event={"ID":"d5efd266-00e3-426a-9136-30920dd67f9f","Type":"ContainerStarted","Data":"5e96aa5151ba54901a252f7120b84565facc956eb8c0c70bb6f6d75c73e835bd"} Dec 04 11:12:40 crc kubenswrapper[4831]: I1204 11:12:40.225475 4831 generic.go:334] "Generic (PLEG): container finished" podID="1018d840-50ec-4676-abde-d52186c7d09b" containerID="ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0" exitCode=0 Dec 04 11:12:40 crc kubenswrapper[4831]: I1204 11:12:40.225537 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlfb" 
event={"ID":"1018d840-50ec-4676-abde-d52186c7d09b","Type":"ContainerDied","Data":"ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0"} Dec 04 11:12:40 crc kubenswrapper[4831]: I1204 11:12:40.228610 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zzf4" event={"ID":"66f3bb3f-4862-4db4-8afe-edd235beb24a","Type":"ContainerStarted","Data":"821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3"} Dec 04 11:12:40 crc kubenswrapper[4831]: I1204 11:12:40.231516 4831 generic.go:334] "Generic (PLEG): container finished" podID="d5efd266-00e3-426a-9136-30920dd67f9f" containerID="7e9245c1e9b8d7a93a7d0616b55194b01dc30327acac84d7b50f69107dd44927" exitCode=0 Dec 04 11:12:40 crc kubenswrapper[4831]: I1204 11:12:40.231546 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqbk2" event={"ID":"d5efd266-00e3-426a-9136-30920dd67f9f","Type":"ContainerDied","Data":"7e9245c1e9b8d7a93a7d0616b55194b01dc30327acac84d7b50f69107dd44927"} Dec 04 11:12:40 crc kubenswrapper[4831]: I1204 11:12:40.264749 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7zzf4" podStartSLOduration=4.7069731820000005 podStartE2EDuration="11.264726455s" podCreationTimestamp="2025-12-04 11:12:29 +0000 UTC" firstStartedPulling="2025-12-04 11:12:31.124387013 +0000 UTC m=+3448.073562327" lastFinishedPulling="2025-12-04 11:12:37.682140286 +0000 UTC m=+3454.631315600" observedRunningTime="2025-12-04 11:12:40.260189108 +0000 UTC m=+3457.209364432" watchObservedRunningTime="2025-12-04 11:12:40.264726455 +0000 UTC m=+3457.213901769" Dec 04 11:12:41 crc kubenswrapper[4831]: I1204 11:12:41.316398 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlfb" 
event={"ID":"1018d840-50ec-4676-abde-d52186c7d09b","Type":"ContainerStarted","Data":"93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd"} Dec 04 11:12:41 crc kubenswrapper[4831]: I1204 11:12:41.351924 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hdlfb" podStartSLOduration=3.8584695780000002 podStartE2EDuration="8.351906328s" podCreationTimestamp="2025-12-04 11:12:33 +0000 UTC" firstStartedPulling="2025-12-04 11:12:36.183366425 +0000 UTC m=+3453.132541739" lastFinishedPulling="2025-12-04 11:12:40.676803175 +0000 UTC m=+3457.625978489" observedRunningTime="2025-12-04 11:12:41.351131476 +0000 UTC m=+3458.300306790" watchObservedRunningTime="2025-12-04 11:12:41.351906328 +0000 UTC m=+3458.301081642" Dec 04 11:12:42 crc kubenswrapper[4831]: I1204 11:12:42.333404 4831 generic.go:334] "Generic (PLEG): container finished" podID="d5efd266-00e3-426a-9136-30920dd67f9f" containerID="ee04a365c691af0d3e8ecb7895e766c67b5f7c0ee1aa88506f4fd0ca59e27e4f" exitCode=0 Dec 04 11:12:42 crc kubenswrapper[4831]: I1204 11:12:42.333770 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqbk2" event={"ID":"d5efd266-00e3-426a-9136-30920dd67f9f","Type":"ContainerDied","Data":"ee04a365c691af0d3e8ecb7895e766c67b5f7c0ee1aa88506f4fd0ca59e27e4f"} Dec 04 11:12:43 crc kubenswrapper[4831]: I1204 11:12:43.348262 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqbk2" event={"ID":"d5efd266-00e3-426a-9136-30920dd67f9f","Type":"ContainerStarted","Data":"2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5"} Dec 04 11:12:43 crc kubenswrapper[4831]: I1204 11:12:43.380622 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vqbk2" podStartSLOduration=4.819894167 podStartE2EDuration="7.380589698s" podCreationTimestamp="2025-12-04 11:12:36 +0000 
UTC" firstStartedPulling="2025-12-04 11:12:40.233136035 +0000 UTC m=+3457.182311349" lastFinishedPulling="2025-12-04 11:12:42.793831556 +0000 UTC m=+3459.743006880" observedRunningTime="2025-12-04 11:12:43.37490199 +0000 UTC m=+3460.324077324" watchObservedRunningTime="2025-12-04 11:12:43.380589698 +0000 UTC m=+3460.329765012" Dec 04 11:12:43 crc kubenswrapper[4831]: I1204 11:12:43.691345 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:43 crc kubenswrapper[4831]: I1204 11:12:43.691410 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:44 crc kubenswrapper[4831]: I1204 11:12:44.748969 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hdlfb" podUID="1018d840-50ec-4676-abde-d52186c7d09b" containerName="registry-server" probeResult="failure" output=< Dec 04 11:12:44 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 04 11:12:44 crc kubenswrapper[4831]: > Dec 04 11:12:47 crc kubenswrapper[4831]: I1204 11:12:47.287744 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:47 crc kubenswrapper[4831]: I1204 11:12:47.288119 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:47 crc kubenswrapper[4831]: I1204 11:12:47.334623 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:47 crc kubenswrapper[4831]: I1204 11:12:47.438488 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:49 crc kubenswrapper[4831]: I1204 11:12:49.149962 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-vqbk2"] Dec 04 11:12:49 crc kubenswrapper[4831]: I1204 11:12:49.405313 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vqbk2" podUID="d5efd266-00e3-426a-9136-30920dd67f9f" containerName="registry-server" containerID="cri-o://2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5" gracePeriod=2 Dec 04 11:12:49 crc kubenswrapper[4831]: I1204 11:12:49.722128 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:49 crc kubenswrapper[4831]: I1204 11:12:49.722452 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:49 crc kubenswrapper[4831]: I1204 11:12:49.776036 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:49 crc kubenswrapper[4831]: I1204 11:12:49.896321 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.024732 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7p8t\" (UniqueName: \"kubernetes.io/projected/d5efd266-00e3-426a-9136-30920dd67f9f-kube-api-access-s7p8t\") pod \"d5efd266-00e3-426a-9136-30920dd67f9f\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.025184 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-utilities\") pod \"d5efd266-00e3-426a-9136-30920dd67f9f\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.025263 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-catalog-content\") pod \"d5efd266-00e3-426a-9136-30920dd67f9f\" (UID: \"d5efd266-00e3-426a-9136-30920dd67f9f\") " Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.025917 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-utilities" (OuterVolumeSpecName: "utilities") pod "d5efd266-00e3-426a-9136-30920dd67f9f" (UID: "d5efd266-00e3-426a-9136-30920dd67f9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.030606 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5efd266-00e3-426a-9136-30920dd67f9f-kube-api-access-s7p8t" (OuterVolumeSpecName: "kube-api-access-s7p8t") pod "d5efd266-00e3-426a-9136-30920dd67f9f" (UID: "d5efd266-00e3-426a-9136-30920dd67f9f"). InnerVolumeSpecName "kube-api-access-s7p8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.045065 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5efd266-00e3-426a-9136-30920dd67f9f" (UID: "d5efd266-00e3-426a-9136-30920dd67f9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.127960 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.127998 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5efd266-00e3-426a-9136-30920dd67f9f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.128012 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7p8t\" (UniqueName: \"kubernetes.io/projected/d5efd266-00e3-426a-9136-30920dd67f9f-kube-api-access-s7p8t\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.415382 4831 generic.go:334] "Generic (PLEG): container finished" podID="d5efd266-00e3-426a-9136-30920dd67f9f" containerID="2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5" exitCode=0 Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.415471 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqbk2" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.415472 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqbk2" event={"ID":"d5efd266-00e3-426a-9136-30920dd67f9f","Type":"ContainerDied","Data":"2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5"} Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.415586 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqbk2" event={"ID":"d5efd266-00e3-426a-9136-30920dd67f9f","Type":"ContainerDied","Data":"5e96aa5151ba54901a252f7120b84565facc956eb8c0c70bb6f6d75c73e835bd"} Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.415603 4831 scope.go:117] "RemoveContainer" containerID="2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.437356 4831 scope.go:117] "RemoveContainer" containerID="ee04a365c691af0d3e8ecb7895e766c67b5f7c0ee1aa88506f4fd0ca59e27e4f" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.459017 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqbk2"] Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.473437 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqbk2"] Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.474716 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.497072 4831 scope.go:117] "RemoveContainer" containerID="7e9245c1e9b8d7a93a7d0616b55194b01dc30327acac84d7b50f69107dd44927" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.527517 4831 scope.go:117] "RemoveContainer" containerID="2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5" Dec 04 11:12:50 crc 
kubenswrapper[4831]: E1204 11:12:50.528072 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5\": container with ID starting with 2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5 not found: ID does not exist" containerID="2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.528102 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5"} err="failed to get container status \"2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5\": rpc error: code = NotFound desc = could not find container \"2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5\": container with ID starting with 2464c6cae8c5b4a114273bf66c1ce8d742572f3c4c89829d9061cf5f16e91be5 not found: ID does not exist" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.528124 4831 scope.go:117] "RemoveContainer" containerID="ee04a365c691af0d3e8ecb7895e766c67b5f7c0ee1aa88506f4fd0ca59e27e4f" Dec 04 11:12:50 crc kubenswrapper[4831]: E1204 11:12:50.528517 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee04a365c691af0d3e8ecb7895e766c67b5f7c0ee1aa88506f4fd0ca59e27e4f\": container with ID starting with ee04a365c691af0d3e8ecb7895e766c67b5f7c0ee1aa88506f4fd0ca59e27e4f not found: ID does not exist" containerID="ee04a365c691af0d3e8ecb7895e766c67b5f7c0ee1aa88506f4fd0ca59e27e4f" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.528564 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee04a365c691af0d3e8ecb7895e766c67b5f7c0ee1aa88506f4fd0ca59e27e4f"} err="failed to get container status 
\"ee04a365c691af0d3e8ecb7895e766c67b5f7c0ee1aa88506f4fd0ca59e27e4f\": rpc error: code = NotFound desc = could not find container \"ee04a365c691af0d3e8ecb7895e766c67b5f7c0ee1aa88506f4fd0ca59e27e4f\": container with ID starting with ee04a365c691af0d3e8ecb7895e766c67b5f7c0ee1aa88506f4fd0ca59e27e4f not found: ID does not exist" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.528600 4831 scope.go:117] "RemoveContainer" containerID="7e9245c1e9b8d7a93a7d0616b55194b01dc30327acac84d7b50f69107dd44927" Dec 04 11:12:50 crc kubenswrapper[4831]: E1204 11:12:50.529063 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e9245c1e9b8d7a93a7d0616b55194b01dc30327acac84d7b50f69107dd44927\": container with ID starting with 7e9245c1e9b8d7a93a7d0616b55194b01dc30327acac84d7b50f69107dd44927 not found: ID does not exist" containerID="7e9245c1e9b8d7a93a7d0616b55194b01dc30327acac84d7b50f69107dd44927" Dec 04 11:12:50 crc kubenswrapper[4831]: I1204 11:12:50.529083 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9245c1e9b8d7a93a7d0616b55194b01dc30327acac84d7b50f69107dd44927"} err="failed to get container status \"7e9245c1e9b8d7a93a7d0616b55194b01dc30327acac84d7b50f69107dd44927\": rpc error: code = NotFound desc = could not find container \"7e9245c1e9b8d7a93a7d0616b55194b01dc30327acac84d7b50f69107dd44927\": container with ID starting with 7e9245c1e9b8d7a93a7d0616b55194b01dc30327acac84d7b50f69107dd44927 not found: ID does not exist" Dec 04 11:12:51 crc kubenswrapper[4831]: I1204 11:12:51.287726 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5efd266-00e3-426a-9136-30920dd67f9f" path="/var/lib/kubelet/pods/d5efd266-00e3-426a-9136-30920dd67f9f/volumes" Dec 04 11:12:52 crc kubenswrapper[4831]: I1204 11:12:52.151354 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7zzf4"] Dec 04 11:12:52 
crc kubenswrapper[4831]: I1204 11:12:52.435034 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7zzf4" podUID="66f3bb3f-4862-4db4-8afe-edd235beb24a" containerName="registry-server" containerID="cri-o://821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3" gracePeriod=2 Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:52.957610 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:52.988331 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-catalog-content\") pod \"66f3bb3f-4862-4db4-8afe-edd235beb24a\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:52.988389 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-utilities\") pod \"66f3bb3f-4862-4db4-8afe-edd235beb24a\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:52.988492 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phrvr\" (UniqueName: \"kubernetes.io/projected/66f3bb3f-4862-4db4-8afe-edd235beb24a-kube-api-access-phrvr\") pod \"66f3bb3f-4862-4db4-8afe-edd235beb24a\" (UID: \"66f3bb3f-4862-4db4-8afe-edd235beb24a\") " Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:52.989198 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-utilities" (OuterVolumeSpecName: "utilities") pod "66f3bb3f-4862-4db4-8afe-edd235beb24a" (UID: "66f3bb3f-4862-4db4-8afe-edd235beb24a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:52.994270 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f3bb3f-4862-4db4-8afe-edd235beb24a-kube-api-access-phrvr" (OuterVolumeSpecName: "kube-api-access-phrvr") pod "66f3bb3f-4862-4db4-8afe-edd235beb24a" (UID: "66f3bb3f-4862-4db4-8afe-edd235beb24a"). InnerVolumeSpecName "kube-api-access-phrvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.091603 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.091672 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phrvr\" (UniqueName: \"kubernetes.io/projected/66f3bb3f-4862-4db4-8afe-edd235beb24a-kube-api-access-phrvr\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.102865 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66f3bb3f-4862-4db4-8afe-edd235beb24a" (UID: "66f3bb3f-4862-4db4-8afe-edd235beb24a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.192476 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f3bb3f-4862-4db4-8afe-edd235beb24a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.448060 4831 generic.go:334] "Generic (PLEG): container finished" podID="66f3bb3f-4862-4db4-8afe-edd235beb24a" containerID="821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3" exitCode=0 Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.448132 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7zzf4" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.448143 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zzf4" event={"ID":"66f3bb3f-4862-4db4-8afe-edd235beb24a","Type":"ContainerDied","Data":"821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3"} Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.448416 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zzf4" event={"ID":"66f3bb3f-4862-4db4-8afe-edd235beb24a","Type":"ContainerDied","Data":"8c22e415a53bae2bdc53192cebc4d882f0b430b222e8c53daca336e07180d4e9"} Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.448441 4831 scope.go:117] "RemoveContainer" containerID="821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.474914 4831 scope.go:117] "RemoveContainer" containerID="0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.478060 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7zzf4"] Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 
11:12:53.489696 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7zzf4"] Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.506643 4831 scope.go:117] "RemoveContainer" containerID="4f078b742d5ec2884b9952edb7f73a9f28ca2b543abfa7b4c9e1150a8c882ab7" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.557945 4831 scope.go:117] "RemoveContainer" containerID="821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3" Dec 04 11:12:53 crc kubenswrapper[4831]: E1204 11:12:53.558504 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3\": container with ID starting with 821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3 not found: ID does not exist" containerID="821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.558570 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3"} err="failed to get container status \"821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3\": rpc error: code = NotFound desc = could not find container \"821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3\": container with ID starting with 821099a98eb34080d440421ca2604c64e4e3aba1e6bc629fb693a35659252dc3 not found: ID does not exist" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.558604 4831 scope.go:117] "RemoveContainer" containerID="0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5" Dec 04 11:12:53 crc kubenswrapper[4831]: E1204 11:12:53.558922 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5\": container with ID 
starting with 0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5 not found: ID does not exist" containerID="0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.558953 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5"} err="failed to get container status \"0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5\": rpc error: code = NotFound desc = could not find container \"0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5\": container with ID starting with 0426b2caa4381e1f9fa0fce3a80d8ba5ac21e53734d3b087b680d7031819d8c5 not found: ID does not exist" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.558971 4831 scope.go:117] "RemoveContainer" containerID="4f078b742d5ec2884b9952edb7f73a9f28ca2b543abfa7b4c9e1150a8c882ab7" Dec 04 11:12:53 crc kubenswrapper[4831]: E1204 11:12:53.559537 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f078b742d5ec2884b9952edb7f73a9f28ca2b543abfa7b4c9e1150a8c882ab7\": container with ID starting with 4f078b742d5ec2884b9952edb7f73a9f28ca2b543abfa7b4c9e1150a8c882ab7 not found: ID does not exist" containerID="4f078b742d5ec2884b9952edb7f73a9f28ca2b543abfa7b4c9e1150a8c882ab7" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.559613 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f078b742d5ec2884b9952edb7f73a9f28ca2b543abfa7b4c9e1150a8c882ab7"} err="failed to get container status \"4f078b742d5ec2884b9952edb7f73a9f28ca2b543abfa7b4c9e1150a8c882ab7\": rpc error: code = NotFound desc = could not find container \"4f078b742d5ec2884b9952edb7f73a9f28ca2b543abfa7b4c9e1150a8c882ab7\": container with ID starting with 4f078b742d5ec2884b9952edb7f73a9f28ca2b543abfa7b4c9e1150a8c882ab7 not found: 
ID does not exist" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.743934 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:53 crc kubenswrapper[4831]: I1204 11:12:53.809064 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:55 crc kubenswrapper[4831]: I1204 11:12:55.286595 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66f3bb3f-4862-4db4-8afe-edd235beb24a" path="/var/lib/kubelet/pods/66f3bb3f-4862-4db4-8afe-edd235beb24a/volumes" Dec 04 11:12:55 crc kubenswrapper[4831]: I1204 11:12:55.745461 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdlfb"] Dec 04 11:12:55 crc kubenswrapper[4831]: I1204 11:12:55.745695 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hdlfb" podUID="1018d840-50ec-4676-abde-d52186c7d09b" containerName="registry-server" containerID="cri-o://93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd" gracePeriod=2 Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.191762 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.352529 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-catalog-content\") pod \"1018d840-50ec-4676-abde-d52186c7d09b\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.352860 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnzbg\" (UniqueName: \"kubernetes.io/projected/1018d840-50ec-4676-abde-d52186c7d09b-kube-api-access-mnzbg\") pod \"1018d840-50ec-4676-abde-d52186c7d09b\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.353017 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-utilities\") pod \"1018d840-50ec-4676-abde-d52186c7d09b\" (UID: \"1018d840-50ec-4676-abde-d52186c7d09b\") " Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.353729 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-utilities" (OuterVolumeSpecName: "utilities") pod "1018d840-50ec-4676-abde-d52186c7d09b" (UID: "1018d840-50ec-4676-abde-d52186c7d09b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.359914 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1018d840-50ec-4676-abde-d52186c7d09b-kube-api-access-mnzbg" (OuterVolumeSpecName: "kube-api-access-mnzbg") pod "1018d840-50ec-4676-abde-d52186c7d09b" (UID: "1018d840-50ec-4676-abde-d52186c7d09b"). InnerVolumeSpecName "kube-api-access-mnzbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.404774 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1018d840-50ec-4676-abde-d52186c7d09b" (UID: "1018d840-50ec-4676-abde-d52186c7d09b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.455566 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.455601 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnzbg\" (UniqueName: \"kubernetes.io/projected/1018d840-50ec-4676-abde-d52186c7d09b-kube-api-access-mnzbg\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.455613 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1018d840-50ec-4676-abde-d52186c7d09b-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.482110 4831 generic.go:334] "Generic (PLEG): container finished" podID="1018d840-50ec-4676-abde-d52186c7d09b" containerID="93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd" exitCode=0 Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.482146 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdlfb" event={"ID":"1018d840-50ec-4676-abde-d52186c7d09b","Type":"ContainerDied","Data":"93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd"} Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.482195 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-hdlfb" event={"ID":"1018d840-50ec-4676-abde-d52186c7d09b","Type":"ContainerDied","Data":"4b66d1c6991550ad7fc870a809e011d17241a5bf7a261afde1c739abb1c4744d"} Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.482199 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdlfb" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.482217 4831 scope.go:117] "RemoveContainer" containerID="93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.513899 4831 scope.go:117] "RemoveContainer" containerID="ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.525006 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdlfb"] Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.533369 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hdlfb"] Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.553312 4831 scope.go:117] "RemoveContainer" containerID="9d6d96b23c8bc1f6bab2db9222ed68ff9f6af2f368d439da18bb90f6d1a9b69c" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.594863 4831 scope.go:117] "RemoveContainer" containerID="93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd" Dec 04 11:12:56 crc kubenswrapper[4831]: E1204 11:12:56.595474 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd\": container with ID starting with 93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd not found: ID does not exist" containerID="93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 
11:12:56.595515 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd"} err="failed to get container status \"93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd\": rpc error: code = NotFound desc = could not find container \"93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd\": container with ID starting with 93a03377d1baceab567336096867e8a06dbf5508c9d23b455be076c2ef5bfafd not found: ID does not exist" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.595543 4831 scope.go:117] "RemoveContainer" containerID="ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0" Dec 04 11:12:56 crc kubenswrapper[4831]: E1204 11:12:56.595966 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0\": container with ID starting with ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0 not found: ID does not exist" containerID="ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.596017 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0"} err="failed to get container status \"ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0\": rpc error: code = NotFound desc = could not find container \"ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0\": container with ID starting with ded6132a11e3cee42001b3b517874f9086f172e1d3fcc8aeeb89a0e2bb63a5e0 not found: ID does not exist" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.596049 4831 scope.go:117] "RemoveContainer" containerID="9d6d96b23c8bc1f6bab2db9222ed68ff9f6af2f368d439da18bb90f6d1a9b69c" Dec 04 11:12:56 crc 
kubenswrapper[4831]: E1204 11:12:56.596309 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d6d96b23c8bc1f6bab2db9222ed68ff9f6af2f368d439da18bb90f6d1a9b69c\": container with ID starting with 9d6d96b23c8bc1f6bab2db9222ed68ff9f6af2f368d439da18bb90f6d1a9b69c not found: ID does not exist" containerID="9d6d96b23c8bc1f6bab2db9222ed68ff9f6af2f368d439da18bb90f6d1a9b69c" Dec 04 11:12:56 crc kubenswrapper[4831]: I1204 11:12:56.596358 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6d96b23c8bc1f6bab2db9222ed68ff9f6af2f368d439da18bb90f6d1a9b69c"} err="failed to get container status \"9d6d96b23c8bc1f6bab2db9222ed68ff9f6af2f368d439da18bb90f6d1a9b69c\": rpc error: code = NotFound desc = could not find container \"9d6d96b23c8bc1f6bab2db9222ed68ff9f6af2f368d439da18bb90f6d1a9b69c\": container with ID starting with 9d6d96b23c8bc1f6bab2db9222ed68ff9f6af2f368d439da18bb90f6d1a9b69c not found: ID does not exist" Dec 04 11:12:57 crc kubenswrapper[4831]: I1204 11:12:57.290246 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1018d840-50ec-4676-abde-d52186c7d09b" path="/var/lib/kubelet/pods/1018d840-50ec-4676-abde-d52186c7d09b/volumes" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.534073 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5brfj"] Dec 04 11:13:43 crc kubenswrapper[4831]: E1204 11:13:43.534964 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5efd266-00e3-426a-9136-30920dd67f9f" containerName="extract-utilities" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.534976 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5efd266-00e3-426a-9136-30920dd67f9f" containerName="extract-utilities" Dec 04 11:13:43 crc kubenswrapper[4831]: E1204 11:13:43.534986 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d5efd266-00e3-426a-9136-30920dd67f9f" containerName="extract-content" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.534993 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5efd266-00e3-426a-9136-30920dd67f9f" containerName="extract-content" Dec 04 11:13:43 crc kubenswrapper[4831]: E1204 11:13:43.535009 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f3bb3f-4862-4db4-8afe-edd235beb24a" containerName="extract-content" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.535014 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f3bb3f-4862-4db4-8afe-edd235beb24a" containerName="extract-content" Dec 04 11:13:43 crc kubenswrapper[4831]: E1204 11:13:43.535030 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1018d840-50ec-4676-abde-d52186c7d09b" containerName="extract-utilities" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.535035 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1018d840-50ec-4676-abde-d52186c7d09b" containerName="extract-utilities" Dec 04 11:13:43 crc kubenswrapper[4831]: E1204 11:13:43.535044 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1018d840-50ec-4676-abde-d52186c7d09b" containerName="registry-server" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.535049 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1018d840-50ec-4676-abde-d52186c7d09b" containerName="registry-server" Dec 04 11:13:43 crc kubenswrapper[4831]: E1204 11:13:43.535074 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1018d840-50ec-4676-abde-d52186c7d09b" containerName="extract-content" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.535080 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1018d840-50ec-4676-abde-d52186c7d09b" containerName="extract-content" Dec 04 11:13:43 crc kubenswrapper[4831]: E1204 11:13:43.535092 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="66f3bb3f-4862-4db4-8afe-edd235beb24a" containerName="registry-server" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.535097 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f3bb3f-4862-4db4-8afe-edd235beb24a" containerName="registry-server" Dec 04 11:13:43 crc kubenswrapper[4831]: E1204 11:13:43.535114 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f3bb3f-4862-4db4-8afe-edd235beb24a" containerName="extract-utilities" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.535120 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f3bb3f-4862-4db4-8afe-edd235beb24a" containerName="extract-utilities" Dec 04 11:13:43 crc kubenswrapper[4831]: E1204 11:13:43.535135 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5efd266-00e3-426a-9136-30920dd67f9f" containerName="registry-server" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.535140 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5efd266-00e3-426a-9136-30920dd67f9f" containerName="registry-server" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.535335 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5efd266-00e3-426a-9136-30920dd67f9f" containerName="registry-server" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.535350 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1018d840-50ec-4676-abde-d52186c7d09b" containerName="registry-server" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.535360 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="66f3bb3f-4862-4db4-8afe-edd235beb24a" containerName="registry-server" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.536753 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.573784 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5brfj"] Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.620349 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-utilities\") pod \"community-operators-5brfj\" (UID: \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.620557 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-catalog-content\") pod \"community-operators-5brfj\" (UID: \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.620616 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppbpf\" (UniqueName: \"kubernetes.io/projected/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-kube-api-access-ppbpf\") pod \"community-operators-5brfj\" (UID: \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.754531 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-catalog-content\") pod \"community-operators-5brfj\" (UID: \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.754606 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ppbpf\" (UniqueName: \"kubernetes.io/projected/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-kube-api-access-ppbpf\") pod \"community-operators-5brfj\" (UID: \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.754723 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-utilities\") pod \"community-operators-5brfj\" (UID: \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.755076 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-utilities\") pod \"community-operators-5brfj\" (UID: \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.755104 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-catalog-content\") pod \"community-operators-5brfj\" (UID: \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.788544 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppbpf\" (UniqueName: \"kubernetes.io/projected/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-kube-api-access-ppbpf\") pod \"community-operators-5brfj\" (UID: \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:43 crc kubenswrapper[4831]: I1204 11:13:43.856321 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:44 crc kubenswrapper[4831]: I1204 11:13:44.403640 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5brfj"] Dec 04 11:13:45 crc kubenswrapper[4831]: I1204 11:13:45.026725 4831 generic.go:334] "Generic (PLEG): container finished" podID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" containerID="36b8e5c995999937525b0154140a6cbf95706a6af70b881cec7939a83e95d396" exitCode=0 Dec 04 11:13:45 crc kubenswrapper[4831]: I1204 11:13:45.027497 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5brfj" event={"ID":"699ad5f7-e52d-4a56-8880-2adf19bbe6d0","Type":"ContainerDied","Data":"36b8e5c995999937525b0154140a6cbf95706a6af70b881cec7939a83e95d396"} Dec 04 11:13:45 crc kubenswrapper[4831]: I1204 11:13:45.027831 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5brfj" event={"ID":"699ad5f7-e52d-4a56-8880-2adf19bbe6d0","Type":"ContainerStarted","Data":"db572d091f1daf58c7ab470ec7927f88205a9a76919aa5e3eeae32ad68c4f977"} Dec 04 11:13:46 crc kubenswrapper[4831]: I1204 11:13:46.039640 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5brfj" event={"ID":"699ad5f7-e52d-4a56-8880-2adf19bbe6d0","Type":"ContainerStarted","Data":"650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d"} Dec 04 11:13:47 crc kubenswrapper[4831]: I1204 11:13:47.053429 4831 generic.go:334] "Generic (PLEG): container finished" podID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" containerID="650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d" exitCode=0 Dec 04 11:13:47 crc kubenswrapper[4831]: I1204 11:13:47.053553 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5brfj" 
event={"ID":"699ad5f7-e52d-4a56-8880-2adf19bbe6d0","Type":"ContainerDied","Data":"650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d"} Dec 04 11:13:48 crc kubenswrapper[4831]: I1204 11:13:48.064985 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5brfj" event={"ID":"699ad5f7-e52d-4a56-8880-2adf19bbe6d0","Type":"ContainerStarted","Data":"5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b"} Dec 04 11:13:48 crc kubenswrapper[4831]: I1204 11:13:48.102922 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5brfj" podStartSLOduration=2.712795208 podStartE2EDuration="5.102903345s" podCreationTimestamp="2025-12-04 11:13:43 +0000 UTC" firstStartedPulling="2025-12-04 11:13:45.03426029 +0000 UTC m=+3521.983435624" lastFinishedPulling="2025-12-04 11:13:47.424368447 +0000 UTC m=+3524.373543761" observedRunningTime="2025-12-04 11:13:48.095252621 +0000 UTC m=+3525.044427935" watchObservedRunningTime="2025-12-04 11:13:48.102903345 +0000 UTC m=+3525.052078659" Dec 04 11:13:53 crc kubenswrapper[4831]: I1204 11:13:53.856805 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:53 crc kubenswrapper[4831]: I1204 11:13:53.857189 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:53 crc kubenswrapper[4831]: I1204 11:13:53.903395 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:54 crc kubenswrapper[4831]: I1204 11:13:54.206061 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:54 crc kubenswrapper[4831]: I1204 11:13:54.258423 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-5brfj"] Dec 04 11:13:56 crc kubenswrapper[4831]: I1204 11:13:56.175365 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5brfj" podUID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" containerName="registry-server" containerID="cri-o://5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b" gracePeriod=2 Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.169962 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.186796 4831 generic.go:334] "Generic (PLEG): container finished" podID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" containerID="5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b" exitCode=0 Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.186848 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5brfj" event={"ID":"699ad5f7-e52d-4a56-8880-2adf19bbe6d0","Type":"ContainerDied","Data":"5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b"} Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.186870 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5brfj" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.186888 4831 scope.go:117] "RemoveContainer" containerID="5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.186877 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5brfj" event={"ID":"699ad5f7-e52d-4a56-8880-2adf19bbe6d0","Type":"ContainerDied","Data":"db572d091f1daf58c7ab470ec7927f88205a9a76919aa5e3eeae32ad68c4f977"} Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.215700 4831 scope.go:117] "RemoveContainer" containerID="650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.240021 4831 scope.go:117] "RemoveContainer" containerID="36b8e5c995999937525b0154140a6cbf95706a6af70b881cec7939a83e95d396" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.261904 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-utilities\") pod \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\" (UID: \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.262015 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppbpf\" (UniqueName: \"kubernetes.io/projected/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-kube-api-access-ppbpf\") pod \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\" (UID: \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.262047 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-catalog-content\") pod \"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\" (UID: 
\"699ad5f7-e52d-4a56-8880-2adf19bbe6d0\") " Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.263922 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-utilities" (OuterVolumeSpecName: "utilities") pod "699ad5f7-e52d-4a56-8880-2adf19bbe6d0" (UID: "699ad5f7-e52d-4a56-8880-2adf19bbe6d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.275461 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-kube-api-access-ppbpf" (OuterVolumeSpecName: "kube-api-access-ppbpf") pod "699ad5f7-e52d-4a56-8880-2adf19bbe6d0" (UID: "699ad5f7-e52d-4a56-8880-2adf19bbe6d0"). InnerVolumeSpecName "kube-api-access-ppbpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.311209 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "699ad5f7-e52d-4a56-8880-2adf19bbe6d0" (UID: "699ad5f7-e52d-4a56-8880-2adf19bbe6d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.356349 4831 scope.go:117] "RemoveContainer" containerID="5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b" Dec 04 11:13:57 crc kubenswrapper[4831]: E1204 11:13:57.356921 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b\": container with ID starting with 5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b not found: ID does not exist" containerID="5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.356964 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b"} err="failed to get container status \"5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b\": rpc error: code = NotFound desc = could not find container \"5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b\": container with ID starting with 5c7929f7ff6ae6246612f0b02b83cb2fccb997071ea5f38c1cbb282c3300059b not found: ID does not exist" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.356991 4831 scope.go:117] "RemoveContainer" containerID="650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d" Dec 04 11:13:57 crc kubenswrapper[4831]: E1204 11:13:57.358063 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d\": container with ID starting with 650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d not found: ID does not exist" containerID="650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.358092 
4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d"} err="failed to get container status \"650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d\": rpc error: code = NotFound desc = could not find container \"650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d\": container with ID starting with 650c42556590b6e5f87c644dfe95138e24c3ae81c8d6dee467c3a3914697013d not found: ID does not exist" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.358113 4831 scope.go:117] "RemoveContainer" containerID="36b8e5c995999937525b0154140a6cbf95706a6af70b881cec7939a83e95d396" Dec 04 11:13:57 crc kubenswrapper[4831]: E1204 11:13:57.358595 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36b8e5c995999937525b0154140a6cbf95706a6af70b881cec7939a83e95d396\": container with ID starting with 36b8e5c995999937525b0154140a6cbf95706a6af70b881cec7939a83e95d396 not found: ID does not exist" containerID="36b8e5c995999937525b0154140a6cbf95706a6af70b881cec7939a83e95d396" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.358616 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b8e5c995999937525b0154140a6cbf95706a6af70b881cec7939a83e95d396"} err="failed to get container status \"36b8e5c995999937525b0154140a6cbf95706a6af70b881cec7939a83e95d396\": rpc error: code = NotFound desc = could not find container \"36b8e5c995999937525b0154140a6cbf95706a6af70b881cec7939a83e95d396\": container with ID starting with 36b8e5c995999937525b0154140a6cbf95706a6af70b881cec7939a83e95d396 not found: ID does not exist" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.365633 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-utilities\") on node 
\"crc\" DevicePath \"\"" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.365686 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppbpf\" (UniqueName: \"kubernetes.io/projected/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-kube-api-access-ppbpf\") on node \"crc\" DevicePath \"\"" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.365704 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699ad5f7-e52d-4a56-8880-2adf19bbe6d0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.517386 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5brfj"] Dec 04 11:13:57 crc kubenswrapper[4831]: I1204 11:13:57.527240 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5brfj"] Dec 04 11:13:59 crc kubenswrapper[4831]: I1204 11:13:59.292364 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" path="/var/lib/kubelet/pods/699ad5f7-e52d-4a56-8880-2adf19bbe6d0/volumes" Dec 04 11:14:51 crc kubenswrapper[4831]: I1204 11:14:51.971593 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:14:51 crc kubenswrapper[4831]: I1204 11:14:51.972060 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.164969 4831 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9"] Dec 04 11:15:00 crc kubenswrapper[4831]: E1204 11:15:00.165992 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" containerName="extract-utilities" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.166008 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" containerName="extract-utilities" Dec 04 11:15:00 crc kubenswrapper[4831]: E1204 11:15:00.166025 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.166031 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4831]: E1204 11:15:00.166043 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" containerName="extract-content" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.166051 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" containerName="extract-content" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.166247 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="699ad5f7-e52d-4a56-8880-2adf19bbe6d0" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.167058 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.169232 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.170479 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.189996 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9"] Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.236417 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8e126ce-a093-4fa0-b9f3-81c430d9479c-secret-volume\") pod \"collect-profiles-29414115-zdpq9\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.236456 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kcs5\" (UniqueName: \"kubernetes.io/projected/a8e126ce-a093-4fa0-b9f3-81c430d9479c-kube-api-access-7kcs5\") pod \"collect-profiles-29414115-zdpq9\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.236546 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8e126ce-a093-4fa0-b9f3-81c430d9479c-config-volume\") pod \"collect-profiles-29414115-zdpq9\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.338021 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8e126ce-a093-4fa0-b9f3-81c430d9479c-secret-volume\") pod \"collect-profiles-29414115-zdpq9\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.338077 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kcs5\" (UniqueName: \"kubernetes.io/projected/a8e126ce-a093-4fa0-b9f3-81c430d9479c-kube-api-access-7kcs5\") pod \"collect-profiles-29414115-zdpq9\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.338196 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8e126ce-a093-4fa0-b9f3-81c430d9479c-config-volume\") pod \"collect-profiles-29414115-zdpq9\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.340052 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8e126ce-a093-4fa0-b9f3-81c430d9479c-config-volume\") pod \"collect-profiles-29414115-zdpq9\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.350362 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a8e126ce-a093-4fa0-b9f3-81c430d9479c-secret-volume\") pod \"collect-profiles-29414115-zdpq9\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.360239 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kcs5\" (UniqueName: \"kubernetes.io/projected/a8e126ce-a093-4fa0-b9f3-81c430d9479c-kube-api-access-7kcs5\") pod \"collect-profiles-29414115-zdpq9\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.492807 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:00 crc kubenswrapper[4831]: I1204 11:15:00.948035 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9"] Dec 04 11:15:01 crc kubenswrapper[4831]: I1204 11:15:01.830492 4831 generic.go:334] "Generic (PLEG): container finished" podID="a8e126ce-a093-4fa0-b9f3-81c430d9479c" containerID="9f2fda01499090be9212b3a31b76a702b1d898079d0cd4c20f991673cf0a15d5" exitCode=0 Dec 04 11:15:01 crc kubenswrapper[4831]: I1204 11:15:01.830557 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" event={"ID":"a8e126ce-a093-4fa0-b9f3-81c430d9479c","Type":"ContainerDied","Data":"9f2fda01499090be9212b3a31b76a702b1d898079d0cd4c20f991673cf0a15d5"} Dec 04 11:15:01 crc kubenswrapper[4831]: I1204 11:15:01.830814 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" 
event={"ID":"a8e126ce-a093-4fa0-b9f3-81c430d9479c","Type":"ContainerStarted","Data":"ec32ab286e56fc9ff23249712467e3a6ea4c9b0a1e8b83c6567e170ec9ee7a98"} Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.211622 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.302885 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8e126ce-a093-4fa0-b9f3-81c430d9479c-secret-volume\") pod \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.303818 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kcs5\" (UniqueName: \"kubernetes.io/projected/a8e126ce-a093-4fa0-b9f3-81c430d9479c-kube-api-access-7kcs5\") pod \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.303963 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8e126ce-a093-4fa0-b9f3-81c430d9479c-config-volume\") pod \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\" (UID: \"a8e126ce-a093-4fa0-b9f3-81c430d9479c\") " Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.305096 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e126ce-a093-4fa0-b9f3-81c430d9479c-config-volume" (OuterVolumeSpecName: "config-volume") pod "a8e126ce-a093-4fa0-b9f3-81c430d9479c" (UID: "a8e126ce-a093-4fa0-b9f3-81c430d9479c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.309923 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8e126ce-a093-4fa0-b9f3-81c430d9479c-kube-api-access-7kcs5" (OuterVolumeSpecName: "kube-api-access-7kcs5") pod "a8e126ce-a093-4fa0-b9f3-81c430d9479c" (UID: "a8e126ce-a093-4fa0-b9f3-81c430d9479c"). InnerVolumeSpecName "kube-api-access-7kcs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.309995 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e126ce-a093-4fa0-b9f3-81c430d9479c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a8e126ce-a093-4fa0-b9f3-81c430d9479c" (UID: "a8e126ce-a093-4fa0-b9f3-81c430d9479c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.406844 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8e126ce-a093-4fa0-b9f3-81c430d9479c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.406884 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kcs5\" (UniqueName: \"kubernetes.io/projected/a8e126ce-a093-4fa0-b9f3-81c430d9479c-kube-api-access-7kcs5\") on node \"crc\" DevicePath \"\"" Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.406894 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8e126ce-a093-4fa0-b9f3-81c430d9479c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.852594 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" 
event={"ID":"a8e126ce-a093-4fa0-b9f3-81c430d9479c","Type":"ContainerDied","Data":"ec32ab286e56fc9ff23249712467e3a6ea4c9b0a1e8b83c6567e170ec9ee7a98"} Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.852642 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec32ab286e56fc9ff23249712467e3a6ea4c9b0a1e8b83c6567e170ec9ee7a98" Dec 04 11:15:03 crc kubenswrapper[4831]: I1204 11:15:03.852684 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-zdpq9" Dec 04 11:15:04 crc kubenswrapper[4831]: I1204 11:15:04.283629 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"] Dec 04 11:15:04 crc kubenswrapper[4831]: I1204 11:15:04.292175 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-rl7v9"] Dec 04 11:15:05 crc kubenswrapper[4831]: I1204 11:15:05.293113 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb08d97-21c7-4452-b8a7-8a8776ee28dd" path="/var/lib/kubelet/pods/afb08d97-21c7-4452-b8a7-8a8776ee28dd/volumes" Dec 04 11:15:21 crc kubenswrapper[4831]: I1204 11:15:21.971274 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:15:21 crc kubenswrapper[4831]: I1204 11:15:21.971910 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:15:27 crc 
kubenswrapper[4831]: I1204 11:15:27.097597 4831 scope.go:117] "RemoveContainer" containerID="bab3605276bec84e3f327d7aaa6d5632752bdf9a166e75a92d25d378b0f8e676" Dec 04 11:15:51 crc kubenswrapper[4831]: I1204 11:15:51.971638 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:15:51 crc kubenswrapper[4831]: I1204 11:15:51.972215 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:15:51 crc kubenswrapper[4831]: I1204 11:15:51.972272 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 11:15:51 crc kubenswrapper[4831]: I1204 11:15:51.973100 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57e437ff5859ec9f82a8e1fe2572501527333499dc8755f22ed7b112ac0dbdce"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:15:51 crc kubenswrapper[4831]: I1204 11:15:51.973154 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://57e437ff5859ec9f82a8e1fe2572501527333499dc8755f22ed7b112ac0dbdce" gracePeriod=600 Dec 04 11:15:52 crc kubenswrapper[4831]: I1204 
11:15:52.346656 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="57e437ff5859ec9f82a8e1fe2572501527333499dc8755f22ed7b112ac0dbdce" exitCode=0 Dec 04 11:15:52 crc kubenswrapper[4831]: I1204 11:15:52.346730 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"57e437ff5859ec9f82a8e1fe2572501527333499dc8755f22ed7b112ac0dbdce"} Dec 04 11:15:52 crc kubenswrapper[4831]: I1204 11:15:52.347061 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23"} Dec 04 11:15:52 crc kubenswrapper[4831]: I1204 11:15:52.347089 4831 scope.go:117] "RemoveContainer" containerID="7b5d270b8a8f031ed2b37a53c176bca264e70c00ff17665ac0f58a0f5614d78e" Dec 04 11:18:21 crc kubenswrapper[4831]: I1204 11:18:21.972371 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:18:21 crc kubenswrapper[4831]: I1204 11:18:21.973608 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:18:51 crc kubenswrapper[4831]: I1204 11:18:51.971302 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:18:51 crc kubenswrapper[4831]: I1204 11:18:51.971888 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:19:21 crc kubenswrapper[4831]: I1204 11:19:21.971999 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:19:21 crc kubenswrapper[4831]: I1204 11:19:21.972529 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:19:21 crc kubenswrapper[4831]: I1204 11:19:21.972580 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 11:19:21 crc kubenswrapper[4831]: I1204 11:19:21.973415 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:19:21 crc 
kubenswrapper[4831]: I1204 11:19:21.973478 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" gracePeriod=600 Dec 04 11:19:22 crc kubenswrapper[4831]: I1204 11:19:22.143392 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" exitCode=0 Dec 04 11:19:22 crc kubenswrapper[4831]: I1204 11:19:22.143446 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23"} Dec 04 11:19:22 crc kubenswrapper[4831]: I1204 11:19:22.143480 4831 scope.go:117] "RemoveContainer" containerID="57e437ff5859ec9f82a8e1fe2572501527333499dc8755f22ed7b112ac0dbdce" Dec 04 11:19:22 crc kubenswrapper[4831]: E1204 11:19:22.251185 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:19:23 crc kubenswrapper[4831]: I1204 11:19:23.158161 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:19:23 crc kubenswrapper[4831]: E1204 11:19:23.158690 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:19:36 crc kubenswrapper[4831]: I1204 11:19:36.277133 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:19:36 crc kubenswrapper[4831]: E1204 11:19:36.278017 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:19:50 crc kubenswrapper[4831]: I1204 11:19:50.276315 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:19:50 crc kubenswrapper[4831]: E1204 11:19:50.277307 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:20:01 crc kubenswrapper[4831]: I1204 11:20:01.277078 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:20:01 crc kubenswrapper[4831]: E1204 11:20:01.278431 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:20:13 crc kubenswrapper[4831]: I1204 11:20:13.285503 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:20:13 crc kubenswrapper[4831]: E1204 11:20:13.286273 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:20:28 crc kubenswrapper[4831]: I1204 11:20:28.276932 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:20:28 crc kubenswrapper[4831]: E1204 11:20:28.278164 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:20:41 crc kubenswrapper[4831]: I1204 11:20:41.276941 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:20:41 crc kubenswrapper[4831]: E1204 11:20:41.277792 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:20:54 crc kubenswrapper[4831]: I1204 11:20:54.277687 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:20:54 crc kubenswrapper[4831]: E1204 11:20:54.278850 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:21:05 crc kubenswrapper[4831]: I1204 11:21:05.276300 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:21:05 crc kubenswrapper[4831]: E1204 11:21:05.277101 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:21:17 crc kubenswrapper[4831]: I1204 11:21:17.277312 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:21:17 crc kubenswrapper[4831]: E1204 11:21:17.278555 4831 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:21:31 crc kubenswrapper[4831]: I1204 11:21:31.276777 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:21:31 crc kubenswrapper[4831]: E1204 11:21:31.277763 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:21:45 crc kubenswrapper[4831]: I1204 11:21:45.276721 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:21:45 crc kubenswrapper[4831]: E1204 11:21:45.278615 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:21:57 crc kubenswrapper[4831]: I1204 11:21:57.277354 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:21:57 crc kubenswrapper[4831]: E1204 11:21:57.278504 4831 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:22:10 crc kubenswrapper[4831]: I1204 11:22:10.276729 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:22:10 crc kubenswrapper[4831]: E1204 11:22:10.277482 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:22:22 crc kubenswrapper[4831]: I1204 11:22:22.276339 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:22:22 crc kubenswrapper[4831]: E1204 11:22:22.277070 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:22:34 crc kubenswrapper[4831]: I1204 11:22:34.276925 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:22:34 crc kubenswrapper[4831]: E1204 11:22:34.278853 4831 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:22:48 crc kubenswrapper[4831]: I1204 11:22:48.277027 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:22:48 crc kubenswrapper[4831]: E1204 11:22:48.277839 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:23:00 crc kubenswrapper[4831]: I1204 11:23:00.277617 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:23:00 crc kubenswrapper[4831]: E1204 11:23:00.278590 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.545621 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kcqlv"] Dec 04 11:23:04 crc kubenswrapper[4831]: E1204 11:23:04.546938 
4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e126ce-a093-4fa0-b9f3-81c430d9479c" containerName="collect-profiles" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.546962 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e126ce-a093-4fa0-b9f3-81c430d9479c" containerName="collect-profiles" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.547363 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8e126ce-a093-4fa0-b9f3-81c430d9479c" containerName="collect-profiles" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.552268 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.567071 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kcqlv"] Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.617922 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-catalog-content\") pod \"redhat-operators-kcqlv\" (UID: \"13e5eda8-90b8-443c-b358-a0f0d8338085\") " pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.618236 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-utilities\") pod \"redhat-operators-kcqlv\" (UID: \"13e5eda8-90b8-443c-b358-a0f0d8338085\") " pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.618307 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kjkc\" (UniqueName: \"kubernetes.io/projected/13e5eda8-90b8-443c-b358-a0f0d8338085-kube-api-access-5kjkc\") 
pod \"redhat-operators-kcqlv\" (UID: \"13e5eda8-90b8-443c-b358-a0f0d8338085\") " pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.721113 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-utilities\") pod \"redhat-operators-kcqlv\" (UID: \"13e5eda8-90b8-443c-b358-a0f0d8338085\") " pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.721172 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kjkc\" (UniqueName: \"kubernetes.io/projected/13e5eda8-90b8-443c-b358-a0f0d8338085-kube-api-access-5kjkc\") pod \"redhat-operators-kcqlv\" (UID: \"13e5eda8-90b8-443c-b358-a0f0d8338085\") " pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.721301 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-catalog-content\") pod \"redhat-operators-kcqlv\" (UID: \"13e5eda8-90b8-443c-b358-a0f0d8338085\") " pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.721770 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-utilities\") pod \"redhat-operators-kcqlv\" (UID: \"13e5eda8-90b8-443c-b358-a0f0d8338085\") " pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.721770 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-catalog-content\") pod \"redhat-operators-kcqlv\" (UID: 
\"13e5eda8-90b8-443c-b358-a0f0d8338085\") " pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.748034 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kjkc\" (UniqueName: \"kubernetes.io/projected/13e5eda8-90b8-443c-b358-a0f0d8338085-kube-api-access-5kjkc\") pod \"redhat-operators-kcqlv\" (UID: \"13e5eda8-90b8-443c-b358-a0f0d8338085\") " pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:04 crc kubenswrapper[4831]: I1204 11:23:04.883108 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:05 crc kubenswrapper[4831]: I1204 11:23:05.408486 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kcqlv"] Dec 04 11:23:06 crc kubenswrapper[4831]: I1204 11:23:06.369631 4831 generic.go:334] "Generic (PLEG): container finished" podID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerID="235befc0183d8d07a6816837fbe86b64308fcc481c0595fff950c9be61e27512" exitCode=0 Dec 04 11:23:06 crc kubenswrapper[4831]: I1204 11:23:06.369724 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcqlv" event={"ID":"13e5eda8-90b8-443c-b358-a0f0d8338085","Type":"ContainerDied","Data":"235befc0183d8d07a6816837fbe86b64308fcc481c0595fff950c9be61e27512"} Dec 04 11:23:06 crc kubenswrapper[4831]: I1204 11:23:06.369785 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcqlv" event={"ID":"13e5eda8-90b8-443c-b358-a0f0d8338085","Type":"ContainerStarted","Data":"0f960eef0fb7642ef50a1570b9f38f277eba9b2e990baa086e13dcb04ff48ac9"} Dec 04 11:23:06 crc kubenswrapper[4831]: I1204 11:23:06.372567 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 11:23:08 crc kubenswrapper[4831]: I1204 11:23:08.397935 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcqlv" event={"ID":"13e5eda8-90b8-443c-b358-a0f0d8338085","Type":"ContainerStarted","Data":"67b6defa56844e2c16c31ad6a4a6bda137075bca0c73f17d11f98a574e332a63"} Dec 04 11:23:11 crc kubenswrapper[4831]: I1204 11:23:11.432632 4831 generic.go:334] "Generic (PLEG): container finished" podID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerID="67b6defa56844e2c16c31ad6a4a6bda137075bca0c73f17d11f98a574e332a63" exitCode=0 Dec 04 11:23:11 crc kubenswrapper[4831]: I1204 11:23:11.432713 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcqlv" event={"ID":"13e5eda8-90b8-443c-b358-a0f0d8338085","Type":"ContainerDied","Data":"67b6defa56844e2c16c31ad6a4a6bda137075bca0c73f17d11f98a574e332a63"} Dec 04 11:23:12 crc kubenswrapper[4831]: I1204 11:23:12.443617 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcqlv" event={"ID":"13e5eda8-90b8-443c-b358-a0f0d8338085","Type":"ContainerStarted","Data":"6ac975200fc97ae47c42480f97bb40034f4522b0fc176d4a3b44c837aa0ec5e5"} Dec 04 11:23:12 crc kubenswrapper[4831]: I1204 11:23:12.469035 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kcqlv" podStartSLOduration=2.941200791 podStartE2EDuration="8.469014159s" podCreationTimestamp="2025-12-04 11:23:04 +0000 UTC" firstStartedPulling="2025-12-04 11:23:06.372215623 +0000 UTC m=+4083.321390937" lastFinishedPulling="2025-12-04 11:23:11.900028981 +0000 UTC m=+4088.849204305" observedRunningTime="2025-12-04 11:23:12.46423327 +0000 UTC m=+4089.413408584" watchObservedRunningTime="2025-12-04 11:23:12.469014159 +0000 UTC m=+4089.418189463" Dec 04 11:23:14 crc kubenswrapper[4831]: I1204 11:23:14.883827 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:14 crc kubenswrapper[4831]: I1204 
11:23:14.884135 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:15 crc kubenswrapper[4831]: I1204 11:23:15.277221 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:23:15 crc kubenswrapper[4831]: E1204 11:23:15.277546 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:23:15 crc kubenswrapper[4831]: I1204 11:23:15.937827 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kcqlv" podUID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerName="registry-server" probeResult="failure" output=< Dec 04 11:23:15 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 04 11:23:15 crc kubenswrapper[4831]: > Dec 04 11:23:24 crc kubenswrapper[4831]: I1204 11:23:24.961993 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:25 crc kubenswrapper[4831]: I1204 11:23:25.037014 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:25 crc kubenswrapper[4831]: I1204 11:23:25.208248 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kcqlv"] Dec 04 11:23:26 crc kubenswrapper[4831]: I1204 11:23:26.276435 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:23:26 crc kubenswrapper[4831]: E1204 
11:23:26.277120 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:23:26 crc kubenswrapper[4831]: I1204 11:23:26.592099 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kcqlv" podUID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerName="registry-server" containerID="cri-o://6ac975200fc97ae47c42480f97bb40034f4522b0fc176d4a3b44c837aa0ec5e5" gracePeriod=2 Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.163761 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kcqlv_13e5eda8-90b8-443c-b358-a0f0d8338085/registry-server/0.log" Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.165520 4831 generic.go:334] "Generic (PLEG): container finished" podID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerID="6ac975200fc97ae47c42480f97bb40034f4522b0fc176d4a3b44c837aa0ec5e5" exitCode=137 Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.165567 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcqlv" event={"ID":"13e5eda8-90b8-443c-b358-a0f0d8338085","Type":"ContainerDied","Data":"6ac975200fc97ae47c42480f97bb40034f4522b0fc176d4a3b44c837aa0ec5e5"} Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.428813 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kcqlv_13e5eda8-90b8-443c-b358-a0f0d8338085/registry-server/0.log" Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.430065 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.533906 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-catalog-content\") pod \"13e5eda8-90b8-443c-b358-a0f0d8338085\" (UID: \"13e5eda8-90b8-443c-b358-a0f0d8338085\") " Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.533956 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-utilities\") pod \"13e5eda8-90b8-443c-b358-a0f0d8338085\" (UID: \"13e5eda8-90b8-443c-b358-a0f0d8338085\") " Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.534036 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kjkc\" (UniqueName: \"kubernetes.io/projected/13e5eda8-90b8-443c-b358-a0f0d8338085-kube-api-access-5kjkc\") pod \"13e5eda8-90b8-443c-b358-a0f0d8338085\" (UID: \"13e5eda8-90b8-443c-b358-a0f0d8338085\") " Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.535153 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-utilities" (OuterVolumeSpecName: "utilities") pod "13e5eda8-90b8-443c-b358-a0f0d8338085" (UID: "13e5eda8-90b8-443c-b358-a0f0d8338085"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.543066 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13e5eda8-90b8-443c-b358-a0f0d8338085-kube-api-access-5kjkc" (OuterVolumeSpecName: "kube-api-access-5kjkc") pod "13e5eda8-90b8-443c-b358-a0f0d8338085" (UID: "13e5eda8-90b8-443c-b358-a0f0d8338085"). InnerVolumeSpecName "kube-api-access-5kjkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.636693 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.636742 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kjkc\" (UniqueName: \"kubernetes.io/projected/13e5eda8-90b8-443c-b358-a0f0d8338085-kube-api-access-5kjkc\") on node \"crc\" DevicePath \"\"" Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.681445 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13e5eda8-90b8-443c-b358-a0f0d8338085" (UID: "13e5eda8-90b8-443c-b358-a0f0d8338085"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:23:33 crc kubenswrapper[4831]: I1204 11:23:33.738594 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13e5eda8-90b8-443c-b358-a0f0d8338085-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:23:34 crc kubenswrapper[4831]: I1204 11:23:34.177462 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kcqlv_13e5eda8-90b8-443c-b358-a0f0d8338085/registry-server/0.log" Dec 04 11:23:34 crc kubenswrapper[4831]: I1204 11:23:34.179432 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcqlv" event={"ID":"13e5eda8-90b8-443c-b358-a0f0d8338085","Type":"ContainerDied","Data":"0f960eef0fb7642ef50a1570b9f38f277eba9b2e990baa086e13dcb04ff48ac9"} Dec 04 11:23:34 crc kubenswrapper[4831]: I1204 11:23:34.179480 4831 scope.go:117] "RemoveContainer" 
containerID="6ac975200fc97ae47c42480f97bb40034f4522b0fc176d4a3b44c837aa0ec5e5" Dec 04 11:23:34 crc kubenswrapper[4831]: I1204 11:23:34.179621 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcqlv" Dec 04 11:23:34 crc kubenswrapper[4831]: I1204 11:23:34.207565 4831 scope.go:117] "RemoveContainer" containerID="67b6defa56844e2c16c31ad6a4a6bda137075bca0c73f17d11f98a574e332a63" Dec 04 11:23:34 crc kubenswrapper[4831]: I1204 11:23:34.232039 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kcqlv"] Dec 04 11:23:34 crc kubenswrapper[4831]: I1204 11:23:34.244110 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kcqlv"] Dec 04 11:23:34 crc kubenswrapper[4831]: I1204 11:23:34.248955 4831 scope.go:117] "RemoveContainer" containerID="235befc0183d8d07a6816837fbe86b64308fcc481c0595fff950c9be61e27512" Dec 04 11:23:35 crc kubenswrapper[4831]: I1204 11:23:35.289245 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13e5eda8-90b8-443c-b358-a0f0d8338085" path="/var/lib/kubelet/pods/13e5eda8-90b8-443c-b358-a0f0d8338085/volumes" Dec 04 11:23:39 crc kubenswrapper[4831]: I1204 11:23:39.899169 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6l57c"] Dec 04 11:23:39 crc kubenswrapper[4831]: E1204 11:23:39.900320 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerName="extract-content" Dec 04 11:23:39 crc kubenswrapper[4831]: I1204 11:23:39.900337 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerName="extract-content" Dec 04 11:23:39 crc kubenswrapper[4831]: E1204 11:23:39.900370 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerName="extract-utilities" 
Dec 04 11:23:39 crc kubenswrapper[4831]: I1204 11:23:39.900379 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerName="extract-utilities" Dec 04 11:23:39 crc kubenswrapper[4831]: E1204 11:23:39.900418 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerName="registry-server" Dec 04 11:23:39 crc kubenswrapper[4831]: I1204 11:23:39.900430 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerName="registry-server" Dec 04 11:23:39 crc kubenswrapper[4831]: I1204 11:23:39.900694 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="13e5eda8-90b8-443c-b358-a0f0d8338085" containerName="registry-server" Dec 04 11:23:39 crc kubenswrapper[4831]: I1204 11:23:39.902632 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:39 crc kubenswrapper[4831]: I1204 11:23:39.911322 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6l57c"] Dec 04 11:23:40 crc kubenswrapper[4831]: I1204 11:23:40.061194 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-utilities\") pod \"certified-operators-6l57c\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:40 crc kubenswrapper[4831]: I1204 11:23:40.061272 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-catalog-content\") pod \"certified-operators-6l57c\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:40 crc 
kubenswrapper[4831]: I1204 11:23:40.061625 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqwgm\" (UniqueName: \"kubernetes.io/projected/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-kube-api-access-cqwgm\") pod \"certified-operators-6l57c\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:40 crc kubenswrapper[4831]: I1204 11:23:40.163069 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqwgm\" (UniqueName: \"kubernetes.io/projected/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-kube-api-access-cqwgm\") pod \"certified-operators-6l57c\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:40 crc kubenswrapper[4831]: I1204 11:23:40.163193 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-utilities\") pod \"certified-operators-6l57c\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:40 crc kubenswrapper[4831]: I1204 11:23:40.163277 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-catalog-content\") pod \"certified-operators-6l57c\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:40 crc kubenswrapper[4831]: I1204 11:23:40.163922 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-catalog-content\") pod \"certified-operators-6l57c\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " pod="openshift-marketplace/certified-operators-6l57c" Dec 04 
11:23:40 crc kubenswrapper[4831]: I1204 11:23:40.163949 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-utilities\") pod \"certified-operators-6l57c\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:40 crc kubenswrapper[4831]: I1204 11:23:40.204002 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqwgm\" (UniqueName: \"kubernetes.io/projected/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-kube-api-access-cqwgm\") pod \"certified-operators-6l57c\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:40 crc kubenswrapper[4831]: I1204 11:23:40.227008 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:40 crc kubenswrapper[4831]: I1204 11:23:40.277254 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:23:40 crc kubenswrapper[4831]: E1204 11:23:40.277737 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:23:40 crc kubenswrapper[4831]: I1204 11:23:40.638756 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6l57c"] Dec 04 11:23:41 crc kubenswrapper[4831]: I1204 11:23:41.260370 4831 generic.go:334] "Generic (PLEG): container finished" podID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" 
containerID="cccfffe9d5b1d55aa038b98649d16eb5cbda769ddd060928294bd2f336725f62" exitCode=0 Dec 04 11:23:41 crc kubenswrapper[4831]: I1204 11:23:41.260734 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6l57c" event={"ID":"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6","Type":"ContainerDied","Data":"cccfffe9d5b1d55aa038b98649d16eb5cbda769ddd060928294bd2f336725f62"} Dec 04 11:23:41 crc kubenswrapper[4831]: I1204 11:23:41.261650 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6l57c" event={"ID":"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6","Type":"ContainerStarted","Data":"c6382608412f3f911ac814902e657e905821b8c79bd74433ccc93d678c7cedce"} Dec 04 11:23:45 crc kubenswrapper[4831]: I1204 11:23:45.303989 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6l57c" event={"ID":"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6","Type":"ContainerStarted","Data":"5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5"} Dec 04 11:23:46 crc kubenswrapper[4831]: I1204 11:23:46.318597 4831 generic.go:334] "Generic (PLEG): container finished" podID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" containerID="5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5" exitCode=0 Dec 04 11:23:46 crc kubenswrapper[4831]: I1204 11:23:46.318681 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6l57c" event={"ID":"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6","Type":"ContainerDied","Data":"5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5"} Dec 04 11:23:47 crc kubenswrapper[4831]: I1204 11:23:47.340344 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6l57c" event={"ID":"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6","Type":"ContainerStarted","Data":"78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2"} Dec 04 11:23:47 crc 
kubenswrapper[4831]: I1204 11:23:47.365292 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6l57c" podStartSLOduration=2.640611627 podStartE2EDuration="8.365277485s" podCreationTimestamp="2025-12-04 11:23:39 +0000 UTC" firstStartedPulling="2025-12-04 11:23:41.262874408 +0000 UTC m=+4118.212049722" lastFinishedPulling="2025-12-04 11:23:46.987540266 +0000 UTC m=+4123.936715580" observedRunningTime="2025-12-04 11:23:47.35954501 +0000 UTC m=+4124.308720324" watchObservedRunningTime="2025-12-04 11:23:47.365277485 +0000 UTC m=+4124.314452789" Dec 04 11:23:50 crc kubenswrapper[4831]: I1204 11:23:50.228493 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:50 crc kubenswrapper[4831]: I1204 11:23:50.230278 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:50 crc kubenswrapper[4831]: I1204 11:23:50.299031 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:23:53 crc kubenswrapper[4831]: I1204 11:23:53.298124 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:23:53 crc kubenswrapper[4831]: E1204 11:23:53.299571 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:23:59 crc kubenswrapper[4831]: I1204 11:23:59.770466 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4kzhs"] Dec 04 11:23:59 crc kubenswrapper[4831]: I1204 11:23:59.773189 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:23:59 crc kubenswrapper[4831]: I1204 11:23:59.793176 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kzhs"] Dec 04 11:23:59 crc kubenswrapper[4831]: I1204 11:23:59.889330 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xxzn\" (UniqueName: \"kubernetes.io/projected/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-kube-api-access-7xxzn\") pod \"redhat-marketplace-4kzhs\" (UID: \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:23:59 crc kubenswrapper[4831]: I1204 11:23:59.889581 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-utilities\") pod \"redhat-marketplace-4kzhs\" (UID: \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:23:59 crc kubenswrapper[4831]: I1204 11:23:59.889757 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-catalog-content\") pod \"redhat-marketplace-4kzhs\" (UID: \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:23:59 crc kubenswrapper[4831]: I1204 11:23:59.991905 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xxzn\" (UniqueName: \"kubernetes.io/projected/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-kube-api-access-7xxzn\") pod \"redhat-marketplace-4kzhs\" (UID: 
\"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:23:59 crc kubenswrapper[4831]: I1204 11:23:59.991965 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-utilities\") pod \"redhat-marketplace-4kzhs\" (UID: \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:23:59 crc kubenswrapper[4831]: I1204 11:23:59.992002 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-catalog-content\") pod \"redhat-marketplace-4kzhs\" (UID: \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:23:59 crc kubenswrapper[4831]: I1204 11:23:59.992741 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-utilities\") pod \"redhat-marketplace-4kzhs\" (UID: \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:23:59 crc kubenswrapper[4831]: I1204 11:23:59.992828 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-catalog-content\") pod \"redhat-marketplace-4kzhs\" (UID: \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:24:00 crc kubenswrapper[4831]: I1204 11:24:00.010506 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xxzn\" (UniqueName: \"kubernetes.io/projected/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-kube-api-access-7xxzn\") pod \"redhat-marketplace-4kzhs\" (UID: \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " 
pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:24:00 crc kubenswrapper[4831]: I1204 11:24:00.103261 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:24:00 crc kubenswrapper[4831]: I1204 11:24:00.315531 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:24:00 crc kubenswrapper[4831]: I1204 11:24:00.569992 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kzhs"] Dec 04 11:24:01 crc kubenswrapper[4831]: I1204 11:24:01.488558 4831 generic.go:334] "Generic (PLEG): container finished" podID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" containerID="74a9dfa2f4b43b5e0465567da0f87cc00a6c930825686393f912a800439ab095" exitCode=0 Dec 04 11:24:01 crc kubenswrapper[4831]: I1204 11:24:01.488644 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kzhs" event={"ID":"12b96afe-e9f3-48e1-97e7-1de77e5f79ca","Type":"ContainerDied","Data":"74a9dfa2f4b43b5e0465567da0f87cc00a6c930825686393f912a800439ab095"} Dec 04 11:24:01 crc kubenswrapper[4831]: I1204 11:24:01.488856 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kzhs" event={"ID":"12b96afe-e9f3-48e1-97e7-1de77e5f79ca","Type":"ContainerStarted","Data":"c7559d3cd5a84af3538742f66be8a6592e5c3a30251d026dfde12e32a8d50ed0"} Dec 04 11:24:02 crc kubenswrapper[4831]: I1204 11:24:02.564702 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6l57c"] Dec 04 11:24:02 crc kubenswrapper[4831]: I1204 11:24:02.567980 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6l57c" podUID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" containerName="registry-server" 
containerID="cri-o://78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2" gracePeriod=2 Dec 04 11:24:02 crc kubenswrapper[4831]: E1204 11:24:02.718937 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc7e200c_e8a3_4afa_8f36_a45bcd7d1ed6.slice/crio-78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2.scope\": RecentStats: unable to find data in memory cache]" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.076753 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.278586 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-utilities\") pod \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.278862 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-catalog-content\") pod \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.278934 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqwgm\" (UniqueName: \"kubernetes.io/projected/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-kube-api-access-cqwgm\") pod \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\" (UID: \"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6\") " Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.279822 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-utilities" 
(OuterVolumeSpecName: "utilities") pod "bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" (UID: "bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.283106 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.289211 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-kube-api-access-cqwgm" (OuterVolumeSpecName: "kube-api-access-cqwgm") pod "bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" (UID: "bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6"). InnerVolumeSpecName "kube-api-access-cqwgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.336743 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" (UID: "bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.384434 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.384623 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqwgm\" (UniqueName: \"kubernetes.io/projected/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6-kube-api-access-cqwgm\") on node \"crc\" DevicePath \"\"" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.512225 4831 generic.go:334] "Generic (PLEG): container finished" podID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" containerID="78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2" exitCode=0 Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.512306 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6l57c" event={"ID":"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6","Type":"ContainerDied","Data":"78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2"} Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.512605 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6l57c" event={"ID":"bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6","Type":"ContainerDied","Data":"c6382608412f3f911ac814902e657e905821b8c79bd74433ccc93d678c7cedce"} Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.512627 4831 scope.go:117] "RemoveContainer" containerID="78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.512364 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6l57c" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.515891 4831 generic.go:334] "Generic (PLEG): container finished" podID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" containerID="9662f097d7626ca240f3c9d375b08213980d933a4152925b2e972c1ba5259e23" exitCode=0 Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.515933 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kzhs" event={"ID":"12b96afe-e9f3-48e1-97e7-1de77e5f79ca","Type":"ContainerDied","Data":"9662f097d7626ca240f3c9d375b08213980d933a4152925b2e972c1ba5259e23"} Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.538639 4831 scope.go:117] "RemoveContainer" containerID="5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.567288 4831 scope.go:117] "RemoveContainer" containerID="cccfffe9d5b1d55aa038b98649d16eb5cbda769ddd060928294bd2f336725f62" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.567334 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6l57c"] Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.580818 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6l57c"] Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.636058 4831 scope.go:117] "RemoveContainer" containerID="78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2" Dec 04 11:24:03 crc kubenswrapper[4831]: E1204 11:24:03.636900 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2\": container with ID starting with 78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2 not found: ID does not exist" 
containerID="78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.636947 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2"} err="failed to get container status \"78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2\": rpc error: code = NotFound desc = could not find container \"78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2\": container with ID starting with 78f76c7ee340b13366ecb640beb28406770cf4571272ed61b8d4514da981d7f2 not found: ID does not exist" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.636976 4831 scope.go:117] "RemoveContainer" containerID="5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5" Dec 04 11:24:03 crc kubenswrapper[4831]: E1204 11:24:03.637338 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5\": container with ID starting with 5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5 not found: ID does not exist" containerID="5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.637395 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5"} err="failed to get container status \"5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5\": rpc error: code = NotFound desc = could not find container \"5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5\": container with ID starting with 5d92902598d9166d1f3bbb8d255b5db1fa206d107e25a860e8c07fae0ef308c5 not found: ID does not exist" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.637429 4831 scope.go:117] 
"RemoveContainer" containerID="cccfffe9d5b1d55aa038b98649d16eb5cbda769ddd060928294bd2f336725f62" Dec 04 11:24:03 crc kubenswrapper[4831]: E1204 11:24:03.637943 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cccfffe9d5b1d55aa038b98649d16eb5cbda769ddd060928294bd2f336725f62\": container with ID starting with cccfffe9d5b1d55aa038b98649d16eb5cbda769ddd060928294bd2f336725f62 not found: ID does not exist" containerID="cccfffe9d5b1d55aa038b98649d16eb5cbda769ddd060928294bd2f336725f62" Dec 04 11:24:03 crc kubenswrapper[4831]: I1204 11:24:03.637975 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccfffe9d5b1d55aa038b98649d16eb5cbda769ddd060928294bd2f336725f62"} err="failed to get container status \"cccfffe9d5b1d55aa038b98649d16eb5cbda769ddd060928294bd2f336725f62\": rpc error: code = NotFound desc = could not find container \"cccfffe9d5b1d55aa038b98649d16eb5cbda769ddd060928294bd2f336725f62\": container with ID starting with cccfffe9d5b1d55aa038b98649d16eb5cbda769ddd060928294bd2f336725f62 not found: ID does not exist" Dec 04 11:24:04 crc kubenswrapper[4831]: I1204 11:24:04.528995 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kzhs" event={"ID":"12b96afe-e9f3-48e1-97e7-1de77e5f79ca","Type":"ContainerStarted","Data":"c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1"} Dec 04 11:24:04 crc kubenswrapper[4831]: I1204 11:24:04.558734 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4kzhs" podStartSLOduration=3.010585408 podStartE2EDuration="5.558633801s" podCreationTimestamp="2025-12-04 11:23:59 +0000 UTC" firstStartedPulling="2025-12-04 11:24:01.49275141 +0000 UTC m=+4138.441926734" lastFinishedPulling="2025-12-04 11:24:04.040799813 +0000 UTC m=+4140.989975127" observedRunningTime="2025-12-04 11:24:04.550981855 
+0000 UTC m=+4141.500157179" watchObservedRunningTime="2025-12-04 11:24:04.558633801 +0000 UTC m=+4141.507809115" Dec 04 11:24:05 crc kubenswrapper[4831]: I1204 11:24:05.277400 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:24:05 crc kubenswrapper[4831]: E1204 11:24:05.277898 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:24:05 crc kubenswrapper[4831]: I1204 11:24:05.289717 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" path="/var/lib/kubelet/pods/bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6/volumes" Dec 04 11:24:10 crc kubenswrapper[4831]: I1204 11:24:10.103478 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:24:10 crc kubenswrapper[4831]: I1204 11:24:10.104092 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:24:10 crc kubenswrapper[4831]: I1204 11:24:10.157810 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:24:10 crc kubenswrapper[4831]: I1204 11:24:10.721299 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:24:11 crc kubenswrapper[4831]: I1204 11:24:11.103804 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kzhs"] Dec 04 11:24:12 crc 
kubenswrapper[4831]: I1204 11:24:12.628093 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4kzhs" podUID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" containerName="registry-server" containerID="cri-o://c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1" gracePeriod=2 Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.143176 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.310737 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xxzn\" (UniqueName: \"kubernetes.io/projected/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-kube-api-access-7xxzn\") pod \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\" (UID: \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.310889 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-catalog-content\") pod \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\" (UID: \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.311048 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-utilities\") pod \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\" (UID: \"12b96afe-e9f3-48e1-97e7-1de77e5f79ca\") " Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.311888 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-utilities" (OuterVolumeSpecName: "utilities") pod "12b96afe-e9f3-48e1-97e7-1de77e5f79ca" (UID: "12b96afe-e9f3-48e1-97e7-1de77e5f79ca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.316020 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-kube-api-access-7xxzn" (OuterVolumeSpecName: "kube-api-access-7xxzn") pod "12b96afe-e9f3-48e1-97e7-1de77e5f79ca" (UID: "12b96afe-e9f3-48e1-97e7-1de77e5f79ca"). InnerVolumeSpecName "kube-api-access-7xxzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.327999 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12b96afe-e9f3-48e1-97e7-1de77e5f79ca" (UID: "12b96afe-e9f3-48e1-97e7-1de77e5f79ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.413527 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xxzn\" (UniqueName: \"kubernetes.io/projected/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-kube-api-access-7xxzn\") on node \"crc\" DevicePath \"\"" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.413566 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.413579 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b96afe-e9f3-48e1-97e7-1de77e5f79ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.640647 4831 generic.go:334] "Generic (PLEG): container finished" podID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" 
containerID="c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1" exitCode=0 Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.640728 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kzhs" event={"ID":"12b96afe-e9f3-48e1-97e7-1de77e5f79ca","Type":"ContainerDied","Data":"c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1"} Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.640758 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4kzhs" event={"ID":"12b96afe-e9f3-48e1-97e7-1de77e5f79ca","Type":"ContainerDied","Data":"c7559d3cd5a84af3538742f66be8a6592e5c3a30251d026dfde12e32a8d50ed0"} Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.640776 4831 scope.go:117] "RemoveContainer" containerID="c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.640919 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4kzhs" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.676152 4831 scope.go:117] "RemoveContainer" containerID="9662f097d7626ca240f3c9d375b08213980d933a4152925b2e972c1ba5259e23" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.682774 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kzhs"] Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.689920 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4kzhs"] Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.707944 4831 scope.go:117] "RemoveContainer" containerID="74a9dfa2f4b43b5e0465567da0f87cc00a6c930825686393f912a800439ab095" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.753877 4831 scope.go:117] "RemoveContainer" containerID="c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1" Dec 04 11:24:13 crc kubenswrapper[4831]: E1204 11:24:13.754245 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1\": container with ID starting with c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1 not found: ID does not exist" containerID="c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.754272 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1"} err="failed to get container status \"c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1\": rpc error: code = NotFound desc = could not find container \"c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1\": container with ID starting with c1b15e2cf99793f10f94e19b0482b3d26ba358b3ed987ad26d7f9e8f064944a1 not found: 
ID does not exist" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.754300 4831 scope.go:117] "RemoveContainer" containerID="9662f097d7626ca240f3c9d375b08213980d933a4152925b2e972c1ba5259e23" Dec 04 11:24:13 crc kubenswrapper[4831]: E1204 11:24:13.754540 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9662f097d7626ca240f3c9d375b08213980d933a4152925b2e972c1ba5259e23\": container with ID starting with 9662f097d7626ca240f3c9d375b08213980d933a4152925b2e972c1ba5259e23 not found: ID does not exist" containerID="9662f097d7626ca240f3c9d375b08213980d933a4152925b2e972c1ba5259e23" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.754562 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9662f097d7626ca240f3c9d375b08213980d933a4152925b2e972c1ba5259e23"} err="failed to get container status \"9662f097d7626ca240f3c9d375b08213980d933a4152925b2e972c1ba5259e23\": rpc error: code = NotFound desc = could not find container \"9662f097d7626ca240f3c9d375b08213980d933a4152925b2e972c1ba5259e23\": container with ID starting with 9662f097d7626ca240f3c9d375b08213980d933a4152925b2e972c1ba5259e23 not found: ID does not exist" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.754580 4831 scope.go:117] "RemoveContainer" containerID="74a9dfa2f4b43b5e0465567da0f87cc00a6c930825686393f912a800439ab095" Dec 04 11:24:13 crc kubenswrapper[4831]: E1204 11:24:13.755123 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a9dfa2f4b43b5e0465567da0f87cc00a6c930825686393f912a800439ab095\": container with ID starting with 74a9dfa2f4b43b5e0465567da0f87cc00a6c930825686393f912a800439ab095 not found: ID does not exist" containerID="74a9dfa2f4b43b5e0465567da0f87cc00a6c930825686393f912a800439ab095" Dec 04 11:24:13 crc kubenswrapper[4831]: I1204 11:24:13.755146 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a9dfa2f4b43b5e0465567da0f87cc00a6c930825686393f912a800439ab095"} err="failed to get container status \"74a9dfa2f4b43b5e0465567da0f87cc00a6c930825686393f912a800439ab095\": rpc error: code = NotFound desc = could not find container \"74a9dfa2f4b43b5e0465567da0f87cc00a6c930825686393f912a800439ab095\": container with ID starting with 74a9dfa2f4b43b5e0465567da0f87cc00a6c930825686393f912a800439ab095 not found: ID does not exist" Dec 04 11:24:15 crc kubenswrapper[4831]: I1204 11:24:15.291459 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" path="/var/lib/kubelet/pods/12b96afe-e9f3-48e1-97e7-1de77e5f79ca/volumes" Dec 04 11:24:17 crc kubenswrapper[4831]: I1204 11:24:17.277153 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:24:17 crc kubenswrapper[4831]: E1204 11:24:17.277707 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:24:29 crc kubenswrapper[4831]: I1204 11:24:29.276320 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:24:30 crc kubenswrapper[4831]: I1204 11:24:30.819841 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"f3c94a5910884e1f52811f0e2b03ff5546d8c61b5cdda975f96f71ab26e2e6af"} Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.443752 
4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-27cd9"] Dec 04 11:25:01 crc kubenswrapper[4831]: E1204 11:25:01.444709 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" containerName="extract-utilities" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.444726 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" containerName="extract-utilities" Dec 04 11:25:01 crc kubenswrapper[4831]: E1204 11:25:01.444751 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" containerName="extract-utilities" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.444759 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" containerName="extract-utilities" Dec 04 11:25:01 crc kubenswrapper[4831]: E1204 11:25:01.444787 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" containerName="extract-content" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.444793 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" containerName="extract-content" Dec 04 11:25:01 crc kubenswrapper[4831]: E1204 11:25:01.444804 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" containerName="registry-server" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.444810 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" containerName="registry-server" Dec 04 11:25:01 crc kubenswrapper[4831]: E1204 11:25:01.444823 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" containerName="registry-server" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.444830 4831 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" containerName="registry-server" Dec 04 11:25:01 crc kubenswrapper[4831]: E1204 11:25:01.444847 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" containerName="extract-content" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.444852 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" containerName="extract-content" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.445079 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b96afe-e9f3-48e1-97e7-1de77e5f79ca" containerName="registry-server" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.445095 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc7e200c-e8a3-4afa-8f36-a45bcd7d1ed6" containerName="registry-server" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.446550 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.461221 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-27cd9"] Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.604921 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-utilities\") pod \"community-operators-27cd9\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.605350 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz8jg\" (UniqueName: \"kubernetes.io/projected/632bebf3-ceaf-4754-9b9f-a637cb642d9e-kube-api-access-bz8jg\") pod \"community-operators-27cd9\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.605476 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-catalog-content\") pod \"community-operators-27cd9\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.708466 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz8jg\" (UniqueName: \"kubernetes.io/projected/632bebf3-ceaf-4754-9b9f-a637cb642d9e-kube-api-access-bz8jg\") pod \"community-operators-27cd9\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.708966 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-catalog-content\") pod \"community-operators-27cd9\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.709289 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-utilities\") pod \"community-operators-27cd9\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.709495 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-catalog-content\") pod \"community-operators-27cd9\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.709816 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-utilities\") pod \"community-operators-27cd9\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.729491 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz8jg\" (UniqueName: \"kubernetes.io/projected/632bebf3-ceaf-4754-9b9f-a637cb642d9e-kube-api-access-bz8jg\") pod \"community-operators-27cd9\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:01 crc kubenswrapper[4831]: I1204 11:25:01.778881 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:02 crc kubenswrapper[4831]: I1204 11:25:02.444571 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-27cd9"] Dec 04 11:25:03 crc kubenswrapper[4831]: I1204 11:25:03.116106 4831 generic.go:334] "Generic (PLEG): container finished" podID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" containerID="81797313a9263e63348122b4f1ac04a26bf0ae045f0ce26b0982825a13e741c6" exitCode=0 Dec 04 11:25:03 crc kubenswrapper[4831]: I1204 11:25:03.116154 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27cd9" event={"ID":"632bebf3-ceaf-4754-9b9f-a637cb642d9e","Type":"ContainerDied","Data":"81797313a9263e63348122b4f1ac04a26bf0ae045f0ce26b0982825a13e741c6"} Dec 04 11:25:03 crc kubenswrapper[4831]: I1204 11:25:03.116467 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27cd9" event={"ID":"632bebf3-ceaf-4754-9b9f-a637cb642d9e","Type":"ContainerStarted","Data":"c9489002f8016bf86e83ec0eb6df59ffab78c49e44a17501364ec1ee6c24435b"} Dec 04 11:25:04 crc kubenswrapper[4831]: I1204 11:25:04.132817 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27cd9" event={"ID":"632bebf3-ceaf-4754-9b9f-a637cb642d9e","Type":"ContainerStarted","Data":"c40c4f77f6e405a20850139816509f70555200f446aea5ac37b122a64ed30dac"} Dec 04 11:25:05 crc kubenswrapper[4831]: I1204 11:25:05.144982 4831 generic.go:334] "Generic (PLEG): container finished" podID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" containerID="c40c4f77f6e405a20850139816509f70555200f446aea5ac37b122a64ed30dac" exitCode=0 Dec 04 11:25:05 crc kubenswrapper[4831]: I1204 11:25:05.145048 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27cd9" 
event={"ID":"632bebf3-ceaf-4754-9b9f-a637cb642d9e","Type":"ContainerDied","Data":"c40c4f77f6e405a20850139816509f70555200f446aea5ac37b122a64ed30dac"} Dec 04 11:25:08 crc kubenswrapper[4831]: I1204 11:25:08.183886 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27cd9" event={"ID":"632bebf3-ceaf-4754-9b9f-a637cb642d9e","Type":"ContainerStarted","Data":"7d1a8ea9c81a97e98b8b4d856977d5400d1cc52e06283bcfb0806e6a5197cc86"} Dec 04 11:25:08 crc kubenswrapper[4831]: I1204 11:25:08.215078 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-27cd9" podStartSLOduration=3.290433073 podStartE2EDuration="7.215033865s" podCreationTimestamp="2025-12-04 11:25:01 +0000 UTC" firstStartedPulling="2025-12-04 11:25:03.118390089 +0000 UTC m=+4200.067565413" lastFinishedPulling="2025-12-04 11:25:07.042990891 +0000 UTC m=+4203.992166205" observedRunningTime="2025-12-04 11:25:08.205785316 +0000 UTC m=+4205.154960630" watchObservedRunningTime="2025-12-04 11:25:08.215033865 +0000 UTC m=+4205.164209199" Dec 04 11:25:11 crc kubenswrapper[4831]: I1204 11:25:11.779130 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:11 crc kubenswrapper[4831]: I1204 11:25:11.779715 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:11 crc kubenswrapper[4831]: I1204 11:25:11.826135 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:12 crc kubenswrapper[4831]: I1204 11:25:12.525709 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:12 crc kubenswrapper[4831]: I1204 11:25:12.576098 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-27cd9"] Dec 04 11:25:14 crc kubenswrapper[4831]: I1204 11:25:14.241871 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-27cd9" podUID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" containerName="registry-server" containerID="cri-o://7d1a8ea9c81a97e98b8b4d856977d5400d1cc52e06283bcfb0806e6a5197cc86" gracePeriod=2 Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.260638 4831 generic.go:334] "Generic (PLEG): container finished" podID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" containerID="7d1a8ea9c81a97e98b8b4d856977d5400d1cc52e06283bcfb0806e6a5197cc86" exitCode=0 Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.260895 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27cd9" event={"ID":"632bebf3-ceaf-4754-9b9f-a637cb642d9e","Type":"ContainerDied","Data":"7d1a8ea9c81a97e98b8b4d856977d5400d1cc52e06283bcfb0806e6a5197cc86"} Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.398532 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.536693 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-utilities\") pod \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.536771 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-catalog-content\") pod \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.536815 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz8jg\" (UniqueName: \"kubernetes.io/projected/632bebf3-ceaf-4754-9b9f-a637cb642d9e-kube-api-access-bz8jg\") pod \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\" (UID: \"632bebf3-ceaf-4754-9b9f-a637cb642d9e\") " Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.537916 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-utilities" (OuterVolumeSpecName: "utilities") pod "632bebf3-ceaf-4754-9b9f-a637cb642d9e" (UID: "632bebf3-ceaf-4754-9b9f-a637cb642d9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.550452 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632bebf3-ceaf-4754-9b9f-a637cb642d9e-kube-api-access-bz8jg" (OuterVolumeSpecName: "kube-api-access-bz8jg") pod "632bebf3-ceaf-4754-9b9f-a637cb642d9e" (UID: "632bebf3-ceaf-4754-9b9f-a637cb642d9e"). InnerVolumeSpecName "kube-api-access-bz8jg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.600680 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "632bebf3-ceaf-4754-9b9f-a637cb642d9e" (UID: "632bebf3-ceaf-4754-9b9f-a637cb642d9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.640074 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.640117 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/632bebf3-ceaf-4754-9b9f-a637cb642d9e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:25:15 crc kubenswrapper[4831]: I1204 11:25:15.640136 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz8jg\" (UniqueName: \"kubernetes.io/projected/632bebf3-ceaf-4754-9b9f-a637cb642d9e-kube-api-access-bz8jg\") on node \"crc\" DevicePath \"\"" Dec 04 11:25:16 crc kubenswrapper[4831]: I1204 11:25:16.274287 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27cd9" event={"ID":"632bebf3-ceaf-4754-9b9f-a637cb642d9e","Type":"ContainerDied","Data":"c9489002f8016bf86e83ec0eb6df59ffab78c49e44a17501364ec1ee6c24435b"} Dec 04 11:25:16 crc kubenswrapper[4831]: I1204 11:25:16.274699 4831 scope.go:117] "RemoveContainer" containerID="7d1a8ea9c81a97e98b8b4d856977d5400d1cc52e06283bcfb0806e6a5197cc86" Dec 04 11:25:16 crc kubenswrapper[4831]: I1204 11:25:16.274384 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-27cd9" Dec 04 11:25:16 crc kubenswrapper[4831]: I1204 11:25:16.311739 4831 scope.go:117] "RemoveContainer" containerID="c40c4f77f6e405a20850139816509f70555200f446aea5ac37b122a64ed30dac" Dec 04 11:25:16 crc kubenswrapper[4831]: I1204 11:25:16.332637 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-27cd9"] Dec 04 11:25:16 crc kubenswrapper[4831]: I1204 11:25:16.344677 4831 scope.go:117] "RemoveContainer" containerID="81797313a9263e63348122b4f1ac04a26bf0ae045f0ce26b0982825a13e741c6" Dec 04 11:25:16 crc kubenswrapper[4831]: I1204 11:25:16.346918 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-27cd9"] Dec 04 11:25:17 crc kubenswrapper[4831]: I1204 11:25:17.287578 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" path="/var/lib/kubelet/pods/632bebf3-ceaf-4754-9b9f-a637cb642d9e/volumes" Dec 04 11:26:51 crc kubenswrapper[4831]: I1204 11:26:51.971801 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:26:51 crc kubenswrapper[4831]: I1204 11:26:51.972260 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:27:21 crc kubenswrapper[4831]: I1204 11:27:21.971572 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:27:21 crc kubenswrapper[4831]: I1204 11:27:21.972305 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:27:51 crc kubenswrapper[4831]: I1204 11:27:51.972138 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:27:51 crc kubenswrapper[4831]: I1204 11:27:51.972831 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:27:51 crc kubenswrapper[4831]: I1204 11:27:51.972889 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 11:27:51 crc kubenswrapper[4831]: I1204 11:27:51.973818 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3c94a5910884e1f52811f0e2b03ff5546d8c61b5cdda975f96f71ab26e2e6af"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:27:51 crc kubenswrapper[4831]: I1204 11:27:51.973887 4831 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://f3c94a5910884e1f52811f0e2b03ff5546d8c61b5cdda975f96f71ab26e2e6af" gracePeriod=600 Dec 04 11:27:52 crc kubenswrapper[4831]: I1204 11:27:52.796206 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="f3c94a5910884e1f52811f0e2b03ff5546d8c61b5cdda975f96f71ab26e2e6af" exitCode=0 Dec 04 11:27:52 crc kubenswrapper[4831]: I1204 11:27:52.796272 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"f3c94a5910884e1f52811f0e2b03ff5546d8c61b5cdda975f96f71ab26e2e6af"} Dec 04 11:27:52 crc kubenswrapper[4831]: I1204 11:27:52.796525 4831 scope.go:117] "RemoveContainer" containerID="bf3c114529f9790daef73bbc5dcad6e21ea3099eb2d7c3da40301b455b0aef23" Dec 04 11:27:53 crc kubenswrapper[4831]: I1204 11:27:53.808930 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b"} Dec 04 11:29:12 crc kubenswrapper[4831]: E1204 11:29:12.512955 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.146:46224->38.102.83.146:38009: write tcp 38.102.83.146:46224->38.102.83.146:38009: write: connection reset by peer Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.164181 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj"] Dec 04 11:30:00 crc kubenswrapper[4831]: E1204 11:30:00.165063 4831 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" containerName="extract-utilities" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.165076 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" containerName="extract-utilities" Dec 04 11:30:00 crc kubenswrapper[4831]: E1204 11:30:00.165091 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" containerName="registry-server" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.165097 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" containerName="registry-server" Dec 04 11:30:00 crc kubenswrapper[4831]: E1204 11:30:00.165126 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" containerName="extract-content" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.165132 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" containerName="extract-content" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.165327 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="632bebf3-ceaf-4754-9b9f-a637cb642d9e" containerName="registry-server" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.166026 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.168904 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.182627 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj"] Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.185444 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.235060 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2n8p\" (UniqueName: \"kubernetes.io/projected/2da7165f-72a7-49ad-b749-75bd32283da9-kube-api-access-h2n8p\") pod \"collect-profiles-29414130-p4gcj\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.235128 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2da7165f-72a7-49ad-b749-75bd32283da9-config-volume\") pod \"collect-profiles-29414130-p4gcj\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.235177 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2da7165f-72a7-49ad-b749-75bd32283da9-secret-volume\") pod \"collect-profiles-29414130-p4gcj\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.337309 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2n8p\" (UniqueName: \"kubernetes.io/projected/2da7165f-72a7-49ad-b749-75bd32283da9-kube-api-access-h2n8p\") pod \"collect-profiles-29414130-p4gcj\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.337364 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2da7165f-72a7-49ad-b749-75bd32283da9-config-volume\") pod \"collect-profiles-29414130-p4gcj\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.337394 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2da7165f-72a7-49ad-b749-75bd32283da9-secret-volume\") pod \"collect-profiles-29414130-p4gcj\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.338262 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2da7165f-72a7-49ad-b749-75bd32283da9-config-volume\") pod \"collect-profiles-29414130-p4gcj\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.344144 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2da7165f-72a7-49ad-b749-75bd32283da9-secret-volume\") pod \"collect-profiles-29414130-p4gcj\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.353314 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2n8p\" (UniqueName: \"kubernetes.io/projected/2da7165f-72a7-49ad-b749-75bd32283da9-kube-api-access-h2n8p\") pod \"collect-profiles-29414130-p4gcj\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:00 crc kubenswrapper[4831]: I1204 11:30:00.487139 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:01 crc kubenswrapper[4831]: I1204 11:30:01.047011 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj"] Dec 04 11:30:02 crc kubenswrapper[4831]: I1204 11:30:02.028031 4831 generic.go:334] "Generic (PLEG): container finished" podID="2da7165f-72a7-49ad-b749-75bd32283da9" containerID="b20eaa4bd2bb4d82aa6ce2d889d6e2afec542701f0b5484817f314e97a4154cd" exitCode=0 Dec 04 11:30:02 crc kubenswrapper[4831]: I1204 11:30:02.028276 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" event={"ID":"2da7165f-72a7-49ad-b749-75bd32283da9","Type":"ContainerDied","Data":"b20eaa4bd2bb4d82aa6ce2d889d6e2afec542701f0b5484817f314e97a4154cd"} Dec 04 11:30:02 crc kubenswrapper[4831]: I1204 11:30:02.028735 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" 
event={"ID":"2da7165f-72a7-49ad-b749-75bd32283da9","Type":"ContainerStarted","Data":"e3f382e0a477078d4c3ec7f4d08bd8b72c7937228d72bd199e403d1ea3a56de4"} Dec 04 11:30:03 crc kubenswrapper[4831]: I1204 11:30:03.608593 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:03 crc kubenswrapper[4831]: I1204 11:30:03.705779 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2da7165f-72a7-49ad-b749-75bd32283da9-config-volume\") pod \"2da7165f-72a7-49ad-b749-75bd32283da9\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " Dec 04 11:30:03 crc kubenswrapper[4831]: I1204 11:30:03.705862 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2n8p\" (UniqueName: \"kubernetes.io/projected/2da7165f-72a7-49ad-b749-75bd32283da9-kube-api-access-h2n8p\") pod \"2da7165f-72a7-49ad-b749-75bd32283da9\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " Dec 04 11:30:03 crc kubenswrapper[4831]: I1204 11:30:03.706217 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2da7165f-72a7-49ad-b749-75bd32283da9-secret-volume\") pod \"2da7165f-72a7-49ad-b749-75bd32283da9\" (UID: \"2da7165f-72a7-49ad-b749-75bd32283da9\") " Dec 04 11:30:03 crc kubenswrapper[4831]: I1204 11:30:03.706752 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2da7165f-72a7-49ad-b749-75bd32283da9-config-volume" (OuterVolumeSpecName: "config-volume") pod "2da7165f-72a7-49ad-b749-75bd32283da9" (UID: "2da7165f-72a7-49ad-b749-75bd32283da9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:30:03 crc kubenswrapper[4831]: I1204 11:30:03.713783 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da7165f-72a7-49ad-b749-75bd32283da9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2da7165f-72a7-49ad-b749-75bd32283da9" (UID: "2da7165f-72a7-49ad-b749-75bd32283da9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:30:03 crc kubenswrapper[4831]: I1204 11:30:03.714107 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da7165f-72a7-49ad-b749-75bd32283da9-kube-api-access-h2n8p" (OuterVolumeSpecName: "kube-api-access-h2n8p") pod "2da7165f-72a7-49ad-b749-75bd32283da9" (UID: "2da7165f-72a7-49ad-b749-75bd32283da9"). InnerVolumeSpecName "kube-api-access-h2n8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:30:03 crc kubenswrapper[4831]: I1204 11:30:03.808625 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2da7165f-72a7-49ad-b749-75bd32283da9-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:30:03 crc kubenswrapper[4831]: I1204 11:30:03.808700 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2da7165f-72a7-49ad-b749-75bd32283da9-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:30:03 crc kubenswrapper[4831]: I1204 11:30:03.808719 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2n8p\" (UniqueName: \"kubernetes.io/projected/2da7165f-72a7-49ad-b749-75bd32283da9-kube-api-access-h2n8p\") on node \"crc\" DevicePath \"\"" Dec 04 11:30:04 crc kubenswrapper[4831]: I1204 11:30:04.049556 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" 
event={"ID":"2da7165f-72a7-49ad-b749-75bd32283da9","Type":"ContainerDied","Data":"e3f382e0a477078d4c3ec7f4d08bd8b72c7937228d72bd199e403d1ea3a56de4"} Dec 04 11:30:04 crc kubenswrapper[4831]: I1204 11:30:04.049594 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f382e0a477078d4c3ec7f4d08bd8b72c7937228d72bd199e403d1ea3a56de4" Dec 04 11:30:04 crc kubenswrapper[4831]: I1204 11:30:04.049621 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-p4gcj" Dec 04 11:30:04 crc kubenswrapper[4831]: I1204 11:30:04.684397 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn"] Dec 04 11:30:04 crc kubenswrapper[4831]: I1204 11:30:04.694780 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-vrmpn"] Dec 04 11:30:05 crc kubenswrapper[4831]: I1204 11:30:05.291617 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2b62802-882f-4b75-95ca-736ffafb4d63" path="/var/lib/kubelet/pods/f2b62802-882f-4b75-95ca-736ffafb4d63/volumes" Dec 04 11:30:21 crc kubenswrapper[4831]: I1204 11:30:21.971025 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:30:21 crc kubenswrapper[4831]: I1204 11:30:21.971450 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:30:27 crc 
kubenswrapper[4831]: I1204 11:30:27.583553 4831 scope.go:117] "RemoveContainer" containerID="06412bf6396e3da51e6721e086e6137cda6d472fcd88c2b07fa29fd2f28c8fd2" Dec 04 11:30:51 crc kubenswrapper[4831]: I1204 11:30:51.971307 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:30:51 crc kubenswrapper[4831]: I1204 11:30:51.972358 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:31:21 crc kubenswrapper[4831]: I1204 11:31:21.971746 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:31:21 crc kubenswrapper[4831]: I1204 11:31:21.972343 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:31:21 crc kubenswrapper[4831]: I1204 11:31:21.972399 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 11:31:21 crc kubenswrapper[4831]: I1204 11:31:21.973235 4831 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:31:21 crc kubenswrapper[4831]: I1204 11:31:21.973329 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" gracePeriod=600 Dec 04 11:31:22 crc kubenswrapper[4831]: E1204 11:31:22.604855 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:31:22 crc kubenswrapper[4831]: I1204 11:31:22.802178 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" exitCode=0 Dec 04 11:31:22 crc kubenswrapper[4831]: I1204 11:31:22.802219 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b"} Dec 04 11:31:22 crc kubenswrapper[4831]: I1204 11:31:22.802256 4831 scope.go:117] "RemoveContainer" containerID="f3c94a5910884e1f52811f0e2b03ff5546d8c61b5cdda975f96f71ab26e2e6af" Dec 04 11:31:22 crc 
kubenswrapper[4831]: I1204 11:31:22.802988 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:31:22 crc kubenswrapper[4831]: E1204 11:31:22.803311 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:31:35 crc kubenswrapper[4831]: I1204 11:31:35.277942 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:31:35 crc kubenswrapper[4831]: E1204 11:31:35.278621 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:31:46 crc kubenswrapper[4831]: I1204 11:31:46.277298 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:31:46 crc kubenswrapper[4831]: E1204 11:31:46.277959 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 
04 11:31:58 crc kubenswrapper[4831]: I1204 11:31:58.276991 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:31:58 crc kubenswrapper[4831]: E1204 11:31:58.277695 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:32:13 crc kubenswrapper[4831]: I1204 11:32:13.283625 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:32:13 crc kubenswrapper[4831]: E1204 11:32:13.284389 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:32:26 crc kubenswrapper[4831]: I1204 11:32:26.278842 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:32:26 crc kubenswrapper[4831]: E1204 11:32:26.279647 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" 
podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:32:39 crc kubenswrapper[4831]: I1204 11:32:39.277072 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:32:39 crc kubenswrapper[4831]: E1204 11:32:39.278952 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:32:52 crc kubenswrapper[4831]: I1204 11:32:52.277040 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:32:52 crc kubenswrapper[4831]: E1204 11:32:52.277622 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:33:07 crc kubenswrapper[4831]: I1204 11:33:07.279848 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:33:07 crc kubenswrapper[4831]: E1204 11:33:07.280911 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:33:22 crc kubenswrapper[4831]: I1204 11:33:22.276939 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:33:22 crc kubenswrapper[4831]: E1204 11:33:22.277990 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.509432 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w5mrk"] Dec 04 11:33:26 crc kubenswrapper[4831]: E1204 11:33:26.510475 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da7165f-72a7-49ad-b749-75bd32283da9" containerName="collect-profiles" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.510493 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da7165f-72a7-49ad-b749-75bd32283da9" containerName="collect-profiles" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.510811 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da7165f-72a7-49ad-b749-75bd32283da9" containerName="collect-profiles" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.512728 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.527308 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5mrk"] Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.610318 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-utilities\") pod \"redhat-operators-w5mrk\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.610486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drktd\" (UniqueName: \"kubernetes.io/projected/bd902396-b033-4df4-8eed-0f8199ba5c8d-kube-api-access-drktd\") pod \"redhat-operators-w5mrk\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.610538 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-catalog-content\") pod \"redhat-operators-w5mrk\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.712674 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-utilities\") pod \"redhat-operators-w5mrk\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.712780 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-drktd\" (UniqueName: \"kubernetes.io/projected/bd902396-b033-4df4-8eed-0f8199ba5c8d-kube-api-access-drktd\") pod \"redhat-operators-w5mrk\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.712812 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-catalog-content\") pod \"redhat-operators-w5mrk\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.713204 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-utilities\") pod \"redhat-operators-w5mrk\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.713263 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-catalog-content\") pod \"redhat-operators-w5mrk\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.750039 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drktd\" (UniqueName: \"kubernetes.io/projected/bd902396-b033-4df4-8eed-0f8199ba5c8d-kube-api-access-drktd\") pod \"redhat-operators-w5mrk\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:26 crc kubenswrapper[4831]: I1204 11:33:26.835551 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:27 crc kubenswrapper[4831]: I1204 11:33:27.341864 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5mrk"] Dec 04 11:33:27 crc kubenswrapper[4831]: I1204 11:33:27.986218 4831 generic.go:334] "Generic (PLEG): container finished" podID="bd902396-b033-4df4-8eed-0f8199ba5c8d" containerID="a6aebdbd7c2bd64603252342a307be16189c75c06a626117d3abcd3bc9cf81c2" exitCode=0 Dec 04 11:33:27 crc kubenswrapper[4831]: I1204 11:33:27.986328 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5mrk" event={"ID":"bd902396-b033-4df4-8eed-0f8199ba5c8d","Type":"ContainerDied","Data":"a6aebdbd7c2bd64603252342a307be16189c75c06a626117d3abcd3bc9cf81c2"} Dec 04 11:33:27 crc kubenswrapper[4831]: I1204 11:33:27.987038 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5mrk" event={"ID":"bd902396-b033-4df4-8eed-0f8199ba5c8d","Type":"ContainerStarted","Data":"661a3cc6dd1220a2e2fac448bbaa51d682a28367c12db32da76658f347caf101"} Dec 04 11:33:27 crc kubenswrapper[4831]: I1204 11:33:27.988735 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 11:33:29 crc kubenswrapper[4831]: I1204 11:33:29.000307 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5mrk" event={"ID":"bd902396-b033-4df4-8eed-0f8199ba5c8d","Type":"ContainerStarted","Data":"5fc0fdd7de510bc6bcd9cd7a1ede5896d9564a4eb16ac6998fec3c8ccecbb6db"} Dec 04 11:33:31 crc kubenswrapper[4831]: I1204 11:33:31.019577 4831 generic.go:334] "Generic (PLEG): container finished" podID="bd902396-b033-4df4-8eed-0f8199ba5c8d" containerID="5fc0fdd7de510bc6bcd9cd7a1ede5896d9564a4eb16ac6998fec3c8ccecbb6db" exitCode=0 Dec 04 11:33:31 crc kubenswrapper[4831]: I1204 11:33:31.019685 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-w5mrk" event={"ID":"bd902396-b033-4df4-8eed-0f8199ba5c8d","Type":"ContainerDied","Data":"5fc0fdd7de510bc6bcd9cd7a1ede5896d9564a4eb16ac6998fec3c8ccecbb6db"} Dec 04 11:33:34 crc kubenswrapper[4831]: I1204 11:33:34.048204 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5mrk" event={"ID":"bd902396-b033-4df4-8eed-0f8199ba5c8d","Type":"ContainerStarted","Data":"24a4d3ad24c001aeaf229bd9e82a4be77c59c367963a78b7f020a4bc1d9bb0e7"} Dec 04 11:33:34 crc kubenswrapper[4831]: I1204 11:33:34.097460 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w5mrk" podStartSLOduration=3.228828044 podStartE2EDuration="8.097400211s" podCreationTimestamp="2025-12-04 11:33:26 +0000 UTC" firstStartedPulling="2025-12-04 11:33:27.988449572 +0000 UTC m=+4704.937624886" lastFinishedPulling="2025-12-04 11:33:32.857021739 +0000 UTC m=+4709.806197053" observedRunningTime="2025-12-04 11:33:34.083896958 +0000 UTC m=+4711.033072272" watchObservedRunningTime="2025-12-04 11:33:34.097400211 +0000 UTC m=+4711.046575515" Dec 04 11:33:35 crc kubenswrapper[4831]: I1204 11:33:35.276829 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:33:35 crc kubenswrapper[4831]: E1204 11:33:35.277772 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:33:36 crc kubenswrapper[4831]: I1204 11:33:36.836902 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:36 crc kubenswrapper[4831]: I1204 11:33:36.837573 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:37 crc kubenswrapper[4831]: I1204 11:33:37.891404 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w5mrk" podUID="bd902396-b033-4df4-8eed-0f8199ba5c8d" containerName="registry-server" probeResult="failure" output=< Dec 04 11:33:37 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 04 11:33:37 crc kubenswrapper[4831]: > Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.804478 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k2ktk"] Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.807513 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.820936 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k2ktk"] Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.874309 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khqzm\" (UniqueName: \"kubernetes.io/projected/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-kube-api-access-khqzm\") pod \"certified-operators-k2ktk\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.874412 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-catalog-content\") pod \"certified-operators-k2ktk\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " 
pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.874458 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-utilities\") pod \"certified-operators-k2ktk\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.976758 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khqzm\" (UniqueName: \"kubernetes.io/projected/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-kube-api-access-khqzm\") pod \"certified-operators-k2ktk\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.976893 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-catalog-content\") pod \"certified-operators-k2ktk\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.976955 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-utilities\") pod \"certified-operators-k2ktk\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.977520 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-utilities\") pod \"certified-operators-k2ktk\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " 
pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.977651 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-catalog-content\") pod \"certified-operators-k2ktk\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:43 crc kubenswrapper[4831]: I1204 11:33:43.996589 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khqzm\" (UniqueName: \"kubernetes.io/projected/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-kube-api-access-khqzm\") pod \"certified-operators-k2ktk\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:44 crc kubenswrapper[4831]: I1204 11:33:44.154151 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:44 crc kubenswrapper[4831]: I1204 11:33:44.727099 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k2ktk"] Dec 04 11:33:44 crc kubenswrapper[4831]: W1204 11:33:44.736142 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a16bde0_04a0_4433_9708_8b6be7bb7ec0.slice/crio-3d1996f6d67c1cfbbf5ae938fd0145939bd6d22e813bb19363a92ed5a8be971a WatchSource:0}: Error finding container 3d1996f6d67c1cfbbf5ae938fd0145939bd6d22e813bb19363a92ed5a8be971a: Status 404 returned error can't find the container with id 3d1996f6d67c1cfbbf5ae938fd0145939bd6d22e813bb19363a92ed5a8be971a Dec 04 11:33:45 crc kubenswrapper[4831]: I1204 11:33:45.149378 4831 generic.go:334] "Generic (PLEG): container finished" podID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" containerID="2c2bf7ef278885b4f7b51254081ebadca4d29ec8c23395b510cd707ee4c466d8" 
exitCode=0 Dec 04 11:33:45 crc kubenswrapper[4831]: I1204 11:33:45.149507 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2ktk" event={"ID":"1a16bde0-04a0-4433-9708-8b6be7bb7ec0","Type":"ContainerDied","Data":"2c2bf7ef278885b4f7b51254081ebadca4d29ec8c23395b510cd707ee4c466d8"} Dec 04 11:33:45 crc kubenswrapper[4831]: I1204 11:33:45.149765 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2ktk" event={"ID":"1a16bde0-04a0-4433-9708-8b6be7bb7ec0","Type":"ContainerStarted","Data":"3d1996f6d67c1cfbbf5ae938fd0145939bd6d22e813bb19363a92ed5a8be971a"} Dec 04 11:33:46 crc kubenswrapper[4831]: I1204 11:33:46.908727 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:46 crc kubenswrapper[4831]: I1204 11:33:46.974189 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:47 crc kubenswrapper[4831]: I1204 11:33:47.174103 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2ktk" event={"ID":"1a16bde0-04a0-4433-9708-8b6be7bb7ec0","Type":"ContainerStarted","Data":"e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b"} Dec 04 11:33:48 crc kubenswrapper[4831]: I1204 11:33:48.188938 4831 generic.go:334] "Generic (PLEG): container finished" podID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" containerID="e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b" exitCode=0 Dec 04 11:33:48 crc kubenswrapper[4831]: I1204 11:33:48.189042 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2ktk" event={"ID":"1a16bde0-04a0-4433-9708-8b6be7bb7ec0","Type":"ContainerDied","Data":"e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b"} Dec 04 11:33:48 crc kubenswrapper[4831]: I1204 11:33:48.985388 
4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5mrk"] Dec 04 11:33:48 crc kubenswrapper[4831]: I1204 11:33:48.986215 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w5mrk" podUID="bd902396-b033-4df4-8eed-0f8199ba5c8d" containerName="registry-server" containerID="cri-o://24a4d3ad24c001aeaf229bd9e82a4be77c59c367963a78b7f020a4bc1d9bb0e7" gracePeriod=2 Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.204591 4831 generic.go:334] "Generic (PLEG): container finished" podID="bd902396-b033-4df4-8eed-0f8199ba5c8d" containerID="24a4d3ad24c001aeaf229bd9e82a4be77c59c367963a78b7f020a4bc1d9bb0e7" exitCode=0 Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.204695 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5mrk" event={"ID":"bd902396-b033-4df4-8eed-0f8199ba5c8d","Type":"ContainerDied","Data":"24a4d3ad24c001aeaf229bd9e82a4be77c59c367963a78b7f020a4bc1d9bb0e7"} Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.208223 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2ktk" event={"ID":"1a16bde0-04a0-4433-9708-8b6be7bb7ec0","Type":"ContainerStarted","Data":"7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e"} Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.232677 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k2ktk" podStartSLOduration=2.687613193 podStartE2EDuration="6.232644367s" podCreationTimestamp="2025-12-04 11:33:43 +0000 UTC" firstStartedPulling="2025-12-04 11:33:45.151781313 +0000 UTC m=+4722.100956617" lastFinishedPulling="2025-12-04 11:33:48.696812467 +0000 UTC m=+4725.645987791" observedRunningTime="2025-12-04 11:33:49.227477489 +0000 UTC m=+4726.176652803" watchObservedRunningTime="2025-12-04 11:33:49.232644367 +0000 UTC 
m=+4726.181819681" Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.278277 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:33:49 crc kubenswrapper[4831]: E1204 11:33:49.278554 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.468112 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.599398 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-utilities\") pod \"bd902396-b033-4df4-8eed-0f8199ba5c8d\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.599581 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drktd\" (UniqueName: \"kubernetes.io/projected/bd902396-b033-4df4-8eed-0f8199ba5c8d-kube-api-access-drktd\") pod \"bd902396-b033-4df4-8eed-0f8199ba5c8d\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.599848 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-catalog-content\") pod \"bd902396-b033-4df4-8eed-0f8199ba5c8d\" (UID: \"bd902396-b033-4df4-8eed-0f8199ba5c8d\") " Dec 04 11:33:49 crc kubenswrapper[4831]: 
I1204 11:33:49.600566 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-utilities" (OuterVolumeSpecName: "utilities") pod "bd902396-b033-4df4-8eed-0f8199ba5c8d" (UID: "bd902396-b033-4df4-8eed-0f8199ba5c8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.605227 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd902396-b033-4df4-8eed-0f8199ba5c8d-kube-api-access-drktd" (OuterVolumeSpecName: "kube-api-access-drktd") pod "bd902396-b033-4df4-8eed-0f8199ba5c8d" (UID: "bd902396-b033-4df4-8eed-0f8199ba5c8d"). InnerVolumeSpecName "kube-api-access-drktd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.702548 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.702595 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drktd\" (UniqueName: \"kubernetes.io/projected/bd902396-b033-4df4-8eed-0f8199ba5c8d-kube-api-access-drktd\") on node \"crc\" DevicePath \"\"" Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.717128 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd902396-b033-4df4-8eed-0f8199ba5c8d" (UID: "bd902396-b033-4df4-8eed-0f8199ba5c8d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:33:49 crc kubenswrapper[4831]: I1204 11:33:49.804158 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd902396-b033-4df4-8eed-0f8199ba5c8d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:33:50 crc kubenswrapper[4831]: I1204 11:33:50.221645 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5mrk" event={"ID":"bd902396-b033-4df4-8eed-0f8199ba5c8d","Type":"ContainerDied","Data":"661a3cc6dd1220a2e2fac448bbaa51d682a28367c12db32da76658f347caf101"} Dec 04 11:33:50 crc kubenswrapper[4831]: I1204 11:33:50.221997 4831 scope.go:117] "RemoveContainer" containerID="24a4d3ad24c001aeaf229bd9e82a4be77c59c367963a78b7f020a4bc1d9bb0e7" Dec 04 11:33:50 crc kubenswrapper[4831]: I1204 11:33:50.221749 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5mrk" Dec 04 11:33:50 crc kubenswrapper[4831]: I1204 11:33:50.258419 4831 scope.go:117] "RemoveContainer" containerID="5fc0fdd7de510bc6bcd9cd7a1ede5896d9564a4eb16ac6998fec3c8ccecbb6db" Dec 04 11:33:50 crc kubenswrapper[4831]: I1204 11:33:50.261014 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5mrk"] Dec 04 11:33:50 crc kubenswrapper[4831]: I1204 11:33:50.273060 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w5mrk"] Dec 04 11:33:50 crc kubenswrapper[4831]: I1204 11:33:50.286282 4831 scope.go:117] "RemoveContainer" containerID="a6aebdbd7c2bd64603252342a307be16189c75c06a626117d3abcd3bc9cf81c2" Dec 04 11:33:51 crc kubenswrapper[4831]: I1204 11:33:51.290629 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd902396-b033-4df4-8eed-0f8199ba5c8d" path="/var/lib/kubelet/pods/bd902396-b033-4df4-8eed-0f8199ba5c8d/volumes" Dec 04 11:33:54 crc 
kubenswrapper[4831]: I1204 11:33:54.154933 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:54 crc kubenswrapper[4831]: I1204 11:33:54.155489 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:54 crc kubenswrapper[4831]: I1204 11:33:54.210969 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:54 crc kubenswrapper[4831]: I1204 11:33:54.322809 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:55 crc kubenswrapper[4831]: I1204 11:33:55.382345 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k2ktk"] Dec 04 11:33:56 crc kubenswrapper[4831]: I1204 11:33:56.279804 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k2ktk" podUID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" containerName="registry-server" containerID="cri-o://7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e" gracePeriod=2 Dec 04 11:33:56 crc kubenswrapper[4831]: I1204 11:33:56.776438 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:56 crc kubenswrapper[4831]: I1204 11:33:56.857616 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-utilities\") pod \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " Dec 04 11:33:56 crc kubenswrapper[4831]: I1204 11:33:56.857695 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khqzm\" (UniqueName: \"kubernetes.io/projected/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-kube-api-access-khqzm\") pod \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " Dec 04 11:33:56 crc kubenswrapper[4831]: I1204 11:33:56.857736 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-catalog-content\") pod \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\" (UID: \"1a16bde0-04a0-4433-9708-8b6be7bb7ec0\") " Dec 04 11:33:56 crc kubenswrapper[4831]: I1204 11:33:56.858410 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-utilities" (OuterVolumeSpecName: "utilities") pod "1a16bde0-04a0-4433-9708-8b6be7bb7ec0" (UID: "1a16bde0-04a0-4433-9708-8b6be7bb7ec0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:33:56 crc kubenswrapper[4831]: I1204 11:33:56.868840 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-kube-api-access-khqzm" (OuterVolumeSpecName: "kube-api-access-khqzm") pod "1a16bde0-04a0-4433-9708-8b6be7bb7ec0" (UID: "1a16bde0-04a0-4433-9708-8b6be7bb7ec0"). InnerVolumeSpecName "kube-api-access-khqzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:33:56 crc kubenswrapper[4831]: I1204 11:33:56.911192 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a16bde0-04a0-4433-9708-8b6be7bb7ec0" (UID: "1a16bde0-04a0-4433-9708-8b6be7bb7ec0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:33:56 crc kubenswrapper[4831]: I1204 11:33:56.960880 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:33:56 crc kubenswrapper[4831]: I1204 11:33:56.961446 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khqzm\" (UniqueName: \"kubernetes.io/projected/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-kube-api-access-khqzm\") on node \"crc\" DevicePath \"\"" Dec 04 11:33:56 crc kubenswrapper[4831]: I1204 11:33:56.961536 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a16bde0-04a0-4433-9708-8b6be7bb7ec0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:33:57 crc kubenswrapper[4831]: I1204 11:33:57.291530 4831 generic.go:334] "Generic (PLEG): container finished" podID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" containerID="7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e" exitCode=0 Dec 04 11:33:57 crc kubenswrapper[4831]: I1204 11:33:57.291627 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k2ktk" Dec 04 11:33:57 crc kubenswrapper[4831]: I1204 11:33:57.292805 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2ktk" event={"ID":"1a16bde0-04a0-4433-9708-8b6be7bb7ec0","Type":"ContainerDied","Data":"7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e"} Dec 04 11:33:57 crc kubenswrapper[4831]: I1204 11:33:57.292918 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2ktk" event={"ID":"1a16bde0-04a0-4433-9708-8b6be7bb7ec0","Type":"ContainerDied","Data":"3d1996f6d67c1cfbbf5ae938fd0145939bd6d22e813bb19363a92ed5a8be971a"} Dec 04 11:33:57 crc kubenswrapper[4831]: I1204 11:33:57.292962 4831 scope.go:117] "RemoveContainer" containerID="7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e" Dec 04 11:33:57 crc kubenswrapper[4831]: I1204 11:33:57.336587 4831 scope.go:117] "RemoveContainer" containerID="e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b" Dec 04 11:33:57 crc kubenswrapper[4831]: I1204 11:33:57.341875 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k2ktk"] Dec 04 11:33:57 crc kubenswrapper[4831]: I1204 11:33:57.355462 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k2ktk"] Dec 04 11:33:57 crc kubenswrapper[4831]: I1204 11:33:57.361042 4831 scope.go:117] "RemoveContainer" containerID="2c2bf7ef278885b4f7b51254081ebadca4d29ec8c23395b510cd707ee4c466d8" Dec 04 11:33:58 crc kubenswrapper[4831]: I1204 11:33:58.197466 4831 scope.go:117] "RemoveContainer" containerID="7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e" Dec 04 11:33:58 crc kubenswrapper[4831]: E1204 11:33:58.198014 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e\": container with ID starting with 7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e not found: ID does not exist" containerID="7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e" Dec 04 11:33:58 crc kubenswrapper[4831]: I1204 11:33:58.198058 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e"} err="failed to get container status \"7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e\": rpc error: code = NotFound desc = could not find container \"7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e\": container with ID starting with 7437fd19943b72b4b2eab96dd2eea79fc2bc3b81f3899210eb8bdfe97389d91e not found: ID does not exist" Dec 04 11:33:58 crc kubenswrapper[4831]: I1204 11:33:58.198088 4831 scope.go:117] "RemoveContainer" containerID="e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b" Dec 04 11:33:58 crc kubenswrapper[4831]: E1204 11:33:58.198376 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b\": container with ID starting with e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b not found: ID does not exist" containerID="e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b" Dec 04 11:33:58 crc kubenswrapper[4831]: I1204 11:33:58.198404 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b"} err="failed to get container status \"e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b\": rpc error: code = NotFound desc = could not find container \"e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b\": container with ID 
starting with e238e221ea80050375711f0c4a80952229318b769494385f7a8f89f4f9a14c1b not found: ID does not exist" Dec 04 11:33:58 crc kubenswrapper[4831]: I1204 11:33:58.198421 4831 scope.go:117] "RemoveContainer" containerID="2c2bf7ef278885b4f7b51254081ebadca4d29ec8c23395b510cd707ee4c466d8" Dec 04 11:33:58 crc kubenswrapper[4831]: E1204 11:33:58.198646 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c2bf7ef278885b4f7b51254081ebadca4d29ec8c23395b510cd707ee4c466d8\": container with ID starting with 2c2bf7ef278885b4f7b51254081ebadca4d29ec8c23395b510cd707ee4c466d8 not found: ID does not exist" containerID="2c2bf7ef278885b4f7b51254081ebadca4d29ec8c23395b510cd707ee4c466d8" Dec 04 11:33:58 crc kubenswrapper[4831]: I1204 11:33:58.198688 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c2bf7ef278885b4f7b51254081ebadca4d29ec8c23395b510cd707ee4c466d8"} err="failed to get container status \"2c2bf7ef278885b4f7b51254081ebadca4d29ec8c23395b510cd707ee4c466d8\": rpc error: code = NotFound desc = could not find container \"2c2bf7ef278885b4f7b51254081ebadca4d29ec8c23395b510cd707ee4c466d8\": container with ID starting with 2c2bf7ef278885b4f7b51254081ebadca4d29ec8c23395b510cd707ee4c466d8 not found: ID does not exist" Dec 04 11:33:59 crc kubenswrapper[4831]: I1204 11:33:59.288836 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" path="/var/lib/kubelet/pods/1a16bde0-04a0-4433-9708-8b6be7bb7ec0/volumes" Dec 04 11:34:02 crc kubenswrapper[4831]: I1204 11:34:02.277074 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:34:02 crc kubenswrapper[4831]: E1204 11:34:02.277717 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.819625 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tpvn8"] Dec 04 11:34:12 crc kubenswrapper[4831]: E1204 11:34:12.820746 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" containerName="registry-server" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.820763 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" containerName="registry-server" Dec 04 11:34:12 crc kubenswrapper[4831]: E1204 11:34:12.820783 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd902396-b033-4df4-8eed-0f8199ba5c8d" containerName="extract-content" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.820790 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd902396-b033-4df4-8eed-0f8199ba5c8d" containerName="extract-content" Dec 04 11:34:12 crc kubenswrapper[4831]: E1204 11:34:12.820800 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" containerName="extract-utilities" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.820809 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" containerName="extract-utilities" Dec 04 11:34:12 crc kubenswrapper[4831]: E1204 11:34:12.820851 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd902396-b033-4df4-8eed-0f8199ba5c8d" containerName="registry-server" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.820859 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd902396-b033-4df4-8eed-0f8199ba5c8d" 
containerName="registry-server" Dec 04 11:34:12 crc kubenswrapper[4831]: E1204 11:34:12.820879 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd902396-b033-4df4-8eed-0f8199ba5c8d" containerName="extract-utilities" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.820886 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd902396-b033-4df4-8eed-0f8199ba5c8d" containerName="extract-utilities" Dec 04 11:34:12 crc kubenswrapper[4831]: E1204 11:34:12.820898 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" containerName="extract-content" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.820905 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" containerName="extract-content" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.821154 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a16bde0-04a0-4433-9708-8b6be7bb7ec0" containerName="registry-server" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.822478 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd902396-b033-4df4-8eed-0f8199ba5c8d" containerName="registry-server" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.824444 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.829928 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tpvn8"] Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.995429 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjh9t\" (UniqueName: \"kubernetes.io/projected/201a7fad-48dc-4c5f-a1d6-6b83c5447046-kube-api-access-vjh9t\") pod \"redhat-marketplace-tpvn8\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.995579 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-catalog-content\") pod \"redhat-marketplace-tpvn8\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:12 crc kubenswrapper[4831]: I1204 11:34:12.995629 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-utilities\") pod \"redhat-marketplace-tpvn8\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:13 crc kubenswrapper[4831]: I1204 11:34:13.097178 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjh9t\" (UniqueName: \"kubernetes.io/projected/201a7fad-48dc-4c5f-a1d6-6b83c5447046-kube-api-access-vjh9t\") pod \"redhat-marketplace-tpvn8\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:13 crc kubenswrapper[4831]: I1204 11:34:13.097288 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-catalog-content\") pod \"redhat-marketplace-tpvn8\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:13 crc kubenswrapper[4831]: I1204 11:34:13.097312 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-utilities\") pod \"redhat-marketplace-tpvn8\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:13 crc kubenswrapper[4831]: I1204 11:34:13.097779 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-utilities\") pod \"redhat-marketplace-tpvn8\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:13 crc kubenswrapper[4831]: I1204 11:34:13.097999 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-catalog-content\") pod \"redhat-marketplace-tpvn8\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:13 crc kubenswrapper[4831]: I1204 11:34:13.129641 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjh9t\" (UniqueName: \"kubernetes.io/projected/201a7fad-48dc-4c5f-a1d6-6b83c5447046-kube-api-access-vjh9t\") pod \"redhat-marketplace-tpvn8\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:13 crc kubenswrapper[4831]: I1204 11:34:13.157506 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:13 crc kubenswrapper[4831]: I1204 11:34:13.782498 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tpvn8"] Dec 04 11:34:14 crc kubenswrapper[4831]: I1204 11:34:14.553780 4831 generic.go:334] "Generic (PLEG): container finished" podID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" containerID="bfc388d7fae140e459d754355d4491f30610f954fbaf51593324b9523f11d9a6" exitCode=0 Dec 04 11:34:14 crc kubenswrapper[4831]: I1204 11:34:14.553858 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tpvn8" event={"ID":"201a7fad-48dc-4c5f-a1d6-6b83c5447046","Type":"ContainerDied","Data":"bfc388d7fae140e459d754355d4491f30610f954fbaf51593324b9523f11d9a6"} Dec 04 11:34:14 crc kubenswrapper[4831]: I1204 11:34:14.554062 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tpvn8" event={"ID":"201a7fad-48dc-4c5f-a1d6-6b83c5447046","Type":"ContainerStarted","Data":"494185bbb23695810ac2a824faa43e23cb8a2466be4e19576bc6b556bab2b44c"} Dec 04 11:34:15 crc kubenswrapper[4831]: I1204 11:34:15.568495 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tpvn8" event={"ID":"201a7fad-48dc-4c5f-a1d6-6b83c5447046","Type":"ContainerStarted","Data":"2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4"} Dec 04 11:34:16 crc kubenswrapper[4831]: I1204 11:34:16.585813 4831 generic.go:334] "Generic (PLEG): container finished" podID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" containerID="2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4" exitCode=0 Dec 04 11:34:16 crc kubenswrapper[4831]: I1204 11:34:16.585872 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tpvn8" 
event={"ID":"201a7fad-48dc-4c5f-a1d6-6b83c5447046","Type":"ContainerDied","Data":"2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4"} Dec 04 11:34:17 crc kubenswrapper[4831]: I1204 11:34:17.277780 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:34:17 crc kubenswrapper[4831]: E1204 11:34:17.278535 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:34:17 crc kubenswrapper[4831]: I1204 11:34:17.597878 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tpvn8" event={"ID":"201a7fad-48dc-4c5f-a1d6-6b83c5447046","Type":"ContainerStarted","Data":"53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31"} Dec 04 11:34:17 crc kubenswrapper[4831]: I1204 11:34:17.625476 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tpvn8" podStartSLOduration=3.188055348 podStartE2EDuration="5.625452106s" podCreationTimestamp="2025-12-04 11:34:12 +0000 UTC" firstStartedPulling="2025-12-04 11:34:14.556493217 +0000 UTC m=+4751.505668531" lastFinishedPulling="2025-12-04 11:34:16.993889975 +0000 UTC m=+4753.943065289" observedRunningTime="2025-12-04 11:34:17.620637467 +0000 UTC m=+4754.569812781" watchObservedRunningTime="2025-12-04 11:34:17.625452106 +0000 UTC m=+4754.574627420" Dec 04 11:34:23 crc kubenswrapper[4831]: I1204 11:34:23.158089 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:23 crc 
kubenswrapper[4831]: I1204 11:34:23.158740 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:23 crc kubenswrapper[4831]: I1204 11:34:23.210224 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:23 crc kubenswrapper[4831]: I1204 11:34:23.969249 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:24 crc kubenswrapper[4831]: I1204 11:34:24.021775 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tpvn8"] Dec 04 11:34:25 crc kubenswrapper[4831]: I1204 11:34:25.709196 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tpvn8" podUID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" containerName="registry-server" containerID="cri-o://53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31" gracePeriod=2 Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.232566 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.429344 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjh9t\" (UniqueName: \"kubernetes.io/projected/201a7fad-48dc-4c5f-a1d6-6b83c5447046-kube-api-access-vjh9t\") pod \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.429675 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-catalog-content\") pod \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.429756 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-utilities\") pod \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\" (UID: \"201a7fad-48dc-4c5f-a1d6-6b83c5447046\") " Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.430911 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-utilities" (OuterVolumeSpecName: "utilities") pod "201a7fad-48dc-4c5f-a1d6-6b83c5447046" (UID: "201a7fad-48dc-4c5f-a1d6-6b83c5447046"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.432365 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.447896 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/201a7fad-48dc-4c5f-a1d6-6b83c5447046-kube-api-access-vjh9t" (OuterVolumeSpecName: "kube-api-access-vjh9t") pod "201a7fad-48dc-4c5f-a1d6-6b83c5447046" (UID: "201a7fad-48dc-4c5f-a1d6-6b83c5447046"). InnerVolumeSpecName "kube-api-access-vjh9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.465350 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "201a7fad-48dc-4c5f-a1d6-6b83c5447046" (UID: "201a7fad-48dc-4c5f-a1d6-6b83c5447046"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.534533 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201a7fad-48dc-4c5f-a1d6-6b83c5447046-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.534862 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjh9t\" (UniqueName: \"kubernetes.io/projected/201a7fad-48dc-4c5f-a1d6-6b83c5447046-kube-api-access-vjh9t\") on node \"crc\" DevicePath \"\"" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.739063 4831 generic.go:334] "Generic (PLEG): container finished" podID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" containerID="53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31" exitCode=0 Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.739142 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tpvn8" event={"ID":"201a7fad-48dc-4c5f-a1d6-6b83c5447046","Type":"ContainerDied","Data":"53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31"} Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.739178 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tpvn8" event={"ID":"201a7fad-48dc-4c5f-a1d6-6b83c5447046","Type":"ContainerDied","Data":"494185bbb23695810ac2a824faa43e23cb8a2466be4e19576bc6b556bab2b44c"} Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.739212 4831 scope.go:117] "RemoveContainer" containerID="53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.739561 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tpvn8" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.766580 4831 scope.go:117] "RemoveContainer" containerID="2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.794016 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tpvn8"] Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.806142 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tpvn8"] Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.815509 4831 scope.go:117] "RemoveContainer" containerID="bfc388d7fae140e459d754355d4491f30610f954fbaf51593324b9523f11d9a6" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.853898 4831 scope.go:117] "RemoveContainer" containerID="53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31" Dec 04 11:34:26 crc kubenswrapper[4831]: E1204 11:34:26.854863 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31\": container with ID starting with 53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31 not found: ID does not exist" containerID="53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.854894 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31"} err="failed to get container status \"53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31\": rpc error: code = NotFound desc = could not find container \"53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31\": container with ID starting with 53f1d91b1aa6e7a78b8c5f46396e83ebcc384c9e706445f0d51b1605c14d7a31 not found: 
ID does not exist" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.854917 4831 scope.go:117] "RemoveContainer" containerID="2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4" Dec 04 11:34:26 crc kubenswrapper[4831]: E1204 11:34:26.855408 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4\": container with ID starting with 2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4 not found: ID does not exist" containerID="2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.855452 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4"} err="failed to get container status \"2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4\": rpc error: code = NotFound desc = could not find container \"2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4\": container with ID starting with 2dd5e5cd8788b726f551cff31c53a2403d3d826911d22b08341fc5915ff9e9c4 not found: ID does not exist" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.855484 4831 scope.go:117] "RemoveContainer" containerID="bfc388d7fae140e459d754355d4491f30610f954fbaf51593324b9523f11d9a6" Dec 04 11:34:26 crc kubenswrapper[4831]: E1204 11:34:26.855849 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc388d7fae140e459d754355d4491f30610f954fbaf51593324b9523f11d9a6\": container with ID starting with bfc388d7fae140e459d754355d4491f30610f954fbaf51593324b9523f11d9a6 not found: ID does not exist" containerID="bfc388d7fae140e459d754355d4491f30610f954fbaf51593324b9523f11d9a6" Dec 04 11:34:26 crc kubenswrapper[4831]: I1204 11:34:26.855874 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc388d7fae140e459d754355d4491f30610f954fbaf51593324b9523f11d9a6"} err="failed to get container status \"bfc388d7fae140e459d754355d4491f30610f954fbaf51593324b9523f11d9a6\": rpc error: code = NotFound desc = could not find container \"bfc388d7fae140e459d754355d4491f30610f954fbaf51593324b9523f11d9a6\": container with ID starting with bfc388d7fae140e459d754355d4491f30610f954fbaf51593324b9523f11d9a6 not found: ID does not exist" Dec 04 11:34:27 crc kubenswrapper[4831]: I1204 11:34:27.292466 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" path="/var/lib/kubelet/pods/201a7fad-48dc-4c5f-a1d6-6b83c5447046/volumes" Dec 04 11:34:29 crc kubenswrapper[4831]: I1204 11:34:29.276550 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:34:29 crc kubenswrapper[4831]: E1204 11:34:29.277081 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:34:40 crc kubenswrapper[4831]: I1204 11:34:40.276469 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:34:40 crc kubenswrapper[4831]: E1204 11:34:40.285773 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:34:54 crc kubenswrapper[4831]: I1204 11:34:54.276721 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:34:54 crc kubenswrapper[4831]: E1204 11:34:54.277559 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:35:05 crc kubenswrapper[4831]: I1204 11:35:05.277592 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:35:05 crc kubenswrapper[4831]: E1204 11:35:05.278411 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:35:18 crc kubenswrapper[4831]: I1204 11:35:18.276161 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:35:18 crc kubenswrapper[4831]: E1204 11:35:18.276856 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.389845 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vllj4"] Dec 04 11:35:25 crc kubenswrapper[4831]: E1204 11:35:25.392249 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" containerName="extract-content" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.393351 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" containerName="extract-content" Dec 04 11:35:25 crc kubenswrapper[4831]: E1204 11:35:25.393452 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" containerName="extract-utilities" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.393533 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" containerName="extract-utilities" Dec 04 11:35:25 crc kubenswrapper[4831]: E1204 11:35:25.393620 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" containerName="registry-server" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.393710 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" containerName="registry-server" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.394042 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="201a7fad-48dc-4c5f-a1d6-6b83c5447046" containerName="registry-server" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.395782 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.405699 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vllj4"] Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.586813 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-catalog-content\") pod \"community-operators-vllj4\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.586911 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-utilities\") pod \"community-operators-vllj4\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.586962 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94p86\" (UniqueName: \"kubernetes.io/projected/37733112-3c76-4ee9-bd13-13e0c50c3805-kube-api-access-94p86\") pod \"community-operators-vllj4\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.689414 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-catalog-content\") pod \"community-operators-vllj4\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.689531 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-utilities\") pod \"community-operators-vllj4\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.689595 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94p86\" (UniqueName: \"kubernetes.io/projected/37733112-3c76-4ee9-bd13-13e0c50c3805-kube-api-access-94p86\") pod \"community-operators-vllj4\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.690157 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-catalog-content\") pod \"community-operators-vllj4\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.690157 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-utilities\") pod \"community-operators-vllj4\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.715737 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94p86\" (UniqueName: \"kubernetes.io/projected/37733112-3c76-4ee9-bd13-13e0c50c3805-kube-api-access-94p86\") pod \"community-operators-vllj4\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:25 crc kubenswrapper[4831]: I1204 11:35:25.726862 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:26 crc kubenswrapper[4831]: I1204 11:35:26.259276 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vllj4"] Dec 04 11:35:26 crc kubenswrapper[4831]: W1204 11:35:26.267119 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37733112_3c76_4ee9_bd13_13e0c50c3805.slice/crio-1be6725868fa6d75e77bb320e67120c5b612c5c5a1202ee22c42835ad6611f71 WatchSource:0}: Error finding container 1be6725868fa6d75e77bb320e67120c5b612c5c5a1202ee22c42835ad6611f71: Status 404 returned error can't find the container with id 1be6725868fa6d75e77bb320e67120c5b612c5c5a1202ee22c42835ad6611f71 Dec 04 11:35:26 crc kubenswrapper[4831]: I1204 11:35:26.310145 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vllj4" event={"ID":"37733112-3c76-4ee9-bd13-13e0c50c3805","Type":"ContainerStarted","Data":"1be6725868fa6d75e77bb320e67120c5b612c5c5a1202ee22c42835ad6611f71"} Dec 04 11:35:27 crc kubenswrapper[4831]: I1204 11:35:27.320928 4831 generic.go:334] "Generic (PLEG): container finished" podID="37733112-3c76-4ee9-bd13-13e0c50c3805" containerID="e8cd6ab6be0499380a2d5c23b80cb6df66d9d79fdf2ed8fa5ddc34a52b85a6d1" exitCode=0 Dec 04 11:35:27 crc kubenswrapper[4831]: I1204 11:35:27.320980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vllj4" event={"ID":"37733112-3c76-4ee9-bd13-13e0c50c3805","Type":"ContainerDied","Data":"e8cd6ab6be0499380a2d5c23b80cb6df66d9d79fdf2ed8fa5ddc34a52b85a6d1"} Dec 04 11:35:29 crc kubenswrapper[4831]: I1204 11:35:29.278284 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:35:29 crc kubenswrapper[4831]: E1204 11:35:29.278511 4831 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:35:29 crc kubenswrapper[4831]: I1204 11:35:29.343406 4831 generic.go:334] "Generic (PLEG): container finished" podID="37733112-3c76-4ee9-bd13-13e0c50c3805" containerID="8bdac41cb50f715596e2d4594b7e2f3052989e4c89b29d15d1d1fc2ff386ce5e" exitCode=0 Dec 04 11:35:29 crc kubenswrapper[4831]: I1204 11:35:29.343462 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vllj4" event={"ID":"37733112-3c76-4ee9-bd13-13e0c50c3805","Type":"ContainerDied","Data":"8bdac41cb50f715596e2d4594b7e2f3052989e4c89b29d15d1d1fc2ff386ce5e"} Dec 04 11:35:30 crc kubenswrapper[4831]: I1204 11:35:30.355877 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vllj4" event={"ID":"37733112-3c76-4ee9-bd13-13e0c50c3805","Type":"ContainerStarted","Data":"b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352"} Dec 04 11:35:30 crc kubenswrapper[4831]: I1204 11:35:30.380624 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vllj4" podStartSLOduration=2.824789338 podStartE2EDuration="5.380607297s" podCreationTimestamp="2025-12-04 11:35:25 +0000 UTC" firstStartedPulling="2025-12-04 11:35:27.326463015 +0000 UTC m=+4824.275638329" lastFinishedPulling="2025-12-04 11:35:29.882280984 +0000 UTC m=+4826.831456288" observedRunningTime="2025-12-04 11:35:30.374443412 +0000 UTC m=+4827.323618736" watchObservedRunningTime="2025-12-04 11:35:30.380607297 +0000 UTC m=+4827.329782611" Dec 04 11:35:35 crc kubenswrapper[4831]: I1204 11:35:35.727398 4831 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:35 crc kubenswrapper[4831]: I1204 11:35:35.728321 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:35 crc kubenswrapper[4831]: I1204 11:35:35.790628 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:36 crc kubenswrapper[4831]: I1204 11:35:36.460106 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:36 crc kubenswrapper[4831]: I1204 11:35:36.513045 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vllj4"] Dec 04 11:35:38 crc kubenswrapper[4831]: I1204 11:35:38.448771 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vllj4" podUID="37733112-3c76-4ee9-bd13-13e0c50c3805" containerName="registry-server" containerID="cri-o://b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352" gracePeriod=2 Dec 04 11:35:38 crc kubenswrapper[4831]: I1204 11:35:38.920764 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.071227 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-catalog-content\") pod \"37733112-3c76-4ee9-bd13-13e0c50c3805\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.071311 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94p86\" (UniqueName: \"kubernetes.io/projected/37733112-3c76-4ee9-bd13-13e0c50c3805-kube-api-access-94p86\") pod \"37733112-3c76-4ee9-bd13-13e0c50c3805\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.071349 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-utilities\") pod \"37733112-3c76-4ee9-bd13-13e0c50c3805\" (UID: \"37733112-3c76-4ee9-bd13-13e0c50c3805\") " Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.073609 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-utilities" (OuterVolumeSpecName: "utilities") pod "37733112-3c76-4ee9-bd13-13e0c50c3805" (UID: "37733112-3c76-4ee9-bd13-13e0c50c3805"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.078475 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37733112-3c76-4ee9-bd13-13e0c50c3805-kube-api-access-94p86" (OuterVolumeSpecName: "kube-api-access-94p86") pod "37733112-3c76-4ee9-bd13-13e0c50c3805" (UID: "37733112-3c76-4ee9-bd13-13e0c50c3805"). InnerVolumeSpecName "kube-api-access-94p86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.174582 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94p86\" (UniqueName: \"kubernetes.io/projected/37733112-3c76-4ee9-bd13-13e0c50c3805-kube-api-access-94p86\") on node \"crc\" DevicePath \"\"" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.174629 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.348116 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37733112-3c76-4ee9-bd13-13e0c50c3805" (UID: "37733112-3c76-4ee9-bd13-13e0c50c3805"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.380917 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37733112-3c76-4ee9-bd13-13e0c50c3805-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.469047 4831 generic.go:334] "Generic (PLEG): container finished" podID="37733112-3c76-4ee9-bd13-13e0c50c3805" containerID="b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352" exitCode=0 Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.469117 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vllj4" event={"ID":"37733112-3c76-4ee9-bd13-13e0c50c3805","Type":"ContainerDied","Data":"b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352"} Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.469164 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-vllj4" event={"ID":"37733112-3c76-4ee9-bd13-13e0c50c3805","Type":"ContainerDied","Data":"1be6725868fa6d75e77bb320e67120c5b612c5c5a1202ee22c42835ad6611f71"} Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.469160 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vllj4" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.469183 4831 scope.go:117] "RemoveContainer" containerID="b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.494944 4831 scope.go:117] "RemoveContainer" containerID="8bdac41cb50f715596e2d4594b7e2f3052989e4c89b29d15d1d1fc2ff386ce5e" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.513817 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vllj4"] Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.528531 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vllj4"] Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.529253 4831 scope.go:117] "RemoveContainer" containerID="e8cd6ab6be0499380a2d5c23b80cb6df66d9d79fdf2ed8fa5ddc34a52b85a6d1" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.567092 4831 scope.go:117] "RemoveContainer" containerID="b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352" Dec 04 11:35:39 crc kubenswrapper[4831]: E1204 11:35:39.567588 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352\": container with ID starting with b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352 not found: ID does not exist" containerID="b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 
11:35:39.567632 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352"} err="failed to get container status \"b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352\": rpc error: code = NotFound desc = could not find container \"b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352\": container with ID starting with b6fc21145fe36384831012c75c11776fa7bd59de6b3215c8be9cdd562a6ea352 not found: ID does not exist" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.567675 4831 scope.go:117] "RemoveContainer" containerID="8bdac41cb50f715596e2d4594b7e2f3052989e4c89b29d15d1d1fc2ff386ce5e" Dec 04 11:35:39 crc kubenswrapper[4831]: E1204 11:35:39.568070 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bdac41cb50f715596e2d4594b7e2f3052989e4c89b29d15d1d1fc2ff386ce5e\": container with ID starting with 8bdac41cb50f715596e2d4594b7e2f3052989e4c89b29d15d1d1fc2ff386ce5e not found: ID does not exist" containerID="8bdac41cb50f715596e2d4594b7e2f3052989e4c89b29d15d1d1fc2ff386ce5e" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.568130 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bdac41cb50f715596e2d4594b7e2f3052989e4c89b29d15d1d1fc2ff386ce5e"} err="failed to get container status \"8bdac41cb50f715596e2d4594b7e2f3052989e4c89b29d15d1d1fc2ff386ce5e\": rpc error: code = NotFound desc = could not find container \"8bdac41cb50f715596e2d4594b7e2f3052989e4c89b29d15d1d1fc2ff386ce5e\": container with ID starting with 8bdac41cb50f715596e2d4594b7e2f3052989e4c89b29d15d1d1fc2ff386ce5e not found: ID does not exist" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.568167 4831 scope.go:117] "RemoveContainer" containerID="e8cd6ab6be0499380a2d5c23b80cb6df66d9d79fdf2ed8fa5ddc34a52b85a6d1" Dec 04 11:35:39 crc 
kubenswrapper[4831]: E1204 11:35:39.568557 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8cd6ab6be0499380a2d5c23b80cb6df66d9d79fdf2ed8fa5ddc34a52b85a6d1\": container with ID starting with e8cd6ab6be0499380a2d5c23b80cb6df66d9d79fdf2ed8fa5ddc34a52b85a6d1 not found: ID does not exist" containerID="e8cd6ab6be0499380a2d5c23b80cb6df66d9d79fdf2ed8fa5ddc34a52b85a6d1" Dec 04 11:35:39 crc kubenswrapper[4831]: I1204 11:35:39.568587 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8cd6ab6be0499380a2d5c23b80cb6df66d9d79fdf2ed8fa5ddc34a52b85a6d1"} err="failed to get container status \"e8cd6ab6be0499380a2d5c23b80cb6df66d9d79fdf2ed8fa5ddc34a52b85a6d1\": rpc error: code = NotFound desc = could not find container \"e8cd6ab6be0499380a2d5c23b80cb6df66d9d79fdf2ed8fa5ddc34a52b85a6d1\": container with ID starting with e8cd6ab6be0499380a2d5c23b80cb6df66d9d79fdf2ed8fa5ddc34a52b85a6d1 not found: ID does not exist" Dec 04 11:35:41 crc kubenswrapper[4831]: I1204 11:35:41.288974 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37733112-3c76-4ee9-bd13-13e0c50c3805" path="/var/lib/kubelet/pods/37733112-3c76-4ee9-bd13-13e0c50c3805/volumes" Dec 04 11:35:42 crc kubenswrapper[4831]: I1204 11:35:42.276484 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:35:42 crc kubenswrapper[4831]: E1204 11:35:42.277079 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:35:55 crc 
kubenswrapper[4831]: I1204 11:35:55.277582 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:35:55 crc kubenswrapper[4831]: E1204 11:35:55.279267 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:36:08 crc kubenswrapper[4831]: I1204 11:36:08.277384 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:36:08 crc kubenswrapper[4831]: E1204 11:36:08.278263 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:36:21 crc kubenswrapper[4831]: I1204 11:36:21.276559 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:36:21 crc kubenswrapper[4831]: E1204 11:36:21.277616 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 
04 11:36:35 crc kubenswrapper[4831]: I1204 11:36:35.277452 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:36:36 crc kubenswrapper[4831]: I1204 11:36:36.065416 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"b910b9f9f31d9b7ea2851fe8d4d87344bf502199f6ffd5f020cfbd2d35fdd9bb"} Dec 04 11:38:51 crc kubenswrapper[4831]: I1204 11:38:51.971473 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:38:51 crc kubenswrapper[4831]: I1204 11:38:51.972061 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:39:21 crc kubenswrapper[4831]: I1204 11:39:21.971800 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:39:21 crc kubenswrapper[4831]: I1204 11:39:21.972399 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 04 11:39:51 crc kubenswrapper[4831]: I1204 11:39:51.971489 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:39:51 crc kubenswrapper[4831]: I1204 11:39:51.972240 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:39:51 crc kubenswrapper[4831]: I1204 11:39:51.972302 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 11:39:51 crc kubenswrapper[4831]: I1204 11:39:51.973252 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b910b9f9f31d9b7ea2851fe8d4d87344bf502199f6ffd5f020cfbd2d35fdd9bb"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:39:51 crc kubenswrapper[4831]: I1204 11:39:51.973318 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://b910b9f9f31d9b7ea2851fe8d4d87344bf502199f6ffd5f020cfbd2d35fdd9bb" gracePeriod=600 Dec 04 11:39:52 crc kubenswrapper[4831]: I1204 11:39:52.894702 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" 
containerID="b910b9f9f31d9b7ea2851fe8d4d87344bf502199f6ffd5f020cfbd2d35fdd9bb" exitCode=0 Dec 04 11:39:52 crc kubenswrapper[4831]: I1204 11:39:52.894762 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"b910b9f9f31d9b7ea2851fe8d4d87344bf502199f6ffd5f020cfbd2d35fdd9bb"} Dec 04 11:39:52 crc kubenswrapper[4831]: I1204 11:39:52.895073 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"} Dec 04 11:39:52 crc kubenswrapper[4831]: I1204 11:39:52.895101 4831 scope.go:117] "RemoveContainer" containerID="8b9b26402f08722e527a1c5e7795687992156ab52df4d8ec561593a4f32e874b" Dec 04 11:42:21 crc kubenswrapper[4831]: I1204 11:42:21.971223 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:42:21 crc kubenswrapper[4831]: I1204 11:42:21.971795 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:42:51 crc kubenswrapper[4831]: I1204 11:42:51.971742 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 04 11:42:51 crc kubenswrapper[4831]: I1204 11:42:51.972286 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:43:21 crc kubenswrapper[4831]: I1204 11:43:21.971859 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:43:21 crc kubenswrapper[4831]: I1204 11:43:21.973068 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:43:21 crc kubenswrapper[4831]: I1204 11:43:21.973143 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 11:43:21 crc kubenswrapper[4831]: I1204 11:43:21.974158 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:43:21 crc kubenswrapper[4831]: I1204 11:43:21.974221 4831 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" gracePeriod=600 Dec 04 11:43:22 crc kubenswrapper[4831]: E1204 11:43:22.114197 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:43:22 crc kubenswrapper[4831]: I1204 11:43:22.946021 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"} Dec 04 11:43:22 crc kubenswrapper[4831]: I1204 11:43:22.946559 4831 scope.go:117] "RemoveContainer" containerID="b910b9f9f31d9b7ea2851fe8d4d87344bf502199f6ffd5f020cfbd2d35fdd9bb" Dec 04 11:43:22 crc kubenswrapper[4831]: I1204 11:43:22.946080 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" exitCode=0 Dec 04 11:43:22 crc kubenswrapper[4831]: I1204 11:43:22.947056 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:43:22 crc kubenswrapper[4831]: E1204 11:43:22.947314 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:43:31 crc kubenswrapper[4831]: I1204 11:43:31.775739 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xjx2g"] Dec 04 11:43:31 crc kubenswrapper[4831]: E1204 11:43:31.776689 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37733112-3c76-4ee9-bd13-13e0c50c3805" containerName="extract-content" Dec 04 11:43:31 crc kubenswrapper[4831]: I1204 11:43:31.776704 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="37733112-3c76-4ee9-bd13-13e0c50c3805" containerName="extract-content" Dec 04 11:43:31 crc kubenswrapper[4831]: E1204 11:43:31.776720 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37733112-3c76-4ee9-bd13-13e0c50c3805" containerName="extract-utilities" Dec 04 11:43:31 crc kubenswrapper[4831]: I1204 11:43:31.776726 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="37733112-3c76-4ee9-bd13-13e0c50c3805" containerName="extract-utilities" Dec 04 11:43:31 crc kubenswrapper[4831]: E1204 11:43:31.776756 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37733112-3c76-4ee9-bd13-13e0c50c3805" containerName="registry-server" Dec 04 11:43:31 crc kubenswrapper[4831]: I1204 11:43:31.776763 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="37733112-3c76-4ee9-bd13-13e0c50c3805" containerName="registry-server" Dec 04 11:43:31 crc kubenswrapper[4831]: I1204 11:43:31.776970 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="37733112-3c76-4ee9-bd13-13e0c50c3805" containerName="registry-server" Dec 04 11:43:31 crc kubenswrapper[4831]: I1204 11:43:31.779453 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:31 crc kubenswrapper[4831]: I1204 11:43:31.789180 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xjx2g"] Dec 04 11:43:31 crc kubenswrapper[4831]: I1204 11:43:31.939602 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccd2f\" (UniqueName: \"kubernetes.io/projected/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-kube-api-access-ccd2f\") pod \"redhat-operators-xjx2g\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:31 crc kubenswrapper[4831]: I1204 11:43:31.939980 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-catalog-content\") pod \"redhat-operators-xjx2g\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:31 crc kubenswrapper[4831]: I1204 11:43:31.940277 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-utilities\") pod \"redhat-operators-xjx2g\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.041962 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-utilities\") pod \"redhat-operators-xjx2g\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.042251 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ccd2f\" (UniqueName: \"kubernetes.io/projected/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-kube-api-access-ccd2f\") pod \"redhat-operators-xjx2g\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.042367 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-catalog-content\") pod \"redhat-operators-xjx2g\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.043068 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-catalog-content\") pod \"redhat-operators-xjx2g\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.043372 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-utilities\") pod \"redhat-operators-xjx2g\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.072271 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccd2f\" (UniqueName: \"kubernetes.io/projected/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-kube-api-access-ccd2f\") pod \"redhat-operators-xjx2g\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.097163 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.637174 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xjx2g"] Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.901239 4831 generic.go:334] "Generic (PLEG): container finished" podID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerID="5d6a119517e1e2d32ba0274075e0ac91bdd9af031896e6708942ea17711ffa2e" exitCode=0 Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.901304 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjx2g" event={"ID":"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b","Type":"ContainerDied","Data":"5d6a119517e1e2d32ba0274075e0ac91bdd9af031896e6708942ea17711ffa2e"} Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.901547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjx2g" event={"ID":"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b","Type":"ContainerStarted","Data":"9a409d32f86af1ec8fc30c63fefa276263880bd4ab097a604a1023b632e98dda"} Dec 04 11:43:32 crc kubenswrapper[4831]: I1204 11:43:32.914860 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 11:43:33 crc kubenswrapper[4831]: I1204 11:43:33.915640 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjx2g" event={"ID":"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b","Type":"ContainerStarted","Data":"dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a"} Dec 04 11:43:34 crc kubenswrapper[4831]: I1204 11:43:34.276644 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:43:34 crc kubenswrapper[4831]: E1204 11:43:34.277154 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:43:36 crc kubenswrapper[4831]: I1204 11:43:36.948366 4831 generic.go:334] "Generic (PLEG): container finished" podID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerID="dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a" exitCode=0 Dec 04 11:43:36 crc kubenswrapper[4831]: I1204 11:43:36.948506 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjx2g" event={"ID":"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b","Type":"ContainerDied","Data":"dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a"} Dec 04 11:43:37 crc kubenswrapper[4831]: I1204 11:43:37.965312 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjx2g" event={"ID":"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b","Type":"ContainerStarted","Data":"1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde"} Dec 04 11:43:37 crc kubenswrapper[4831]: I1204 11:43:37.999388 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xjx2g" podStartSLOduration=2.591344119 podStartE2EDuration="6.999363826s" podCreationTimestamp="2025-12-04 11:43:31 +0000 UTC" firstStartedPulling="2025-12-04 11:43:32.914352882 +0000 UTC m=+5309.863528196" lastFinishedPulling="2025-12-04 11:43:37.322372589 +0000 UTC m=+5314.271547903" observedRunningTime="2025-12-04 11:43:37.983501978 +0000 UTC m=+5314.932677312" watchObservedRunningTime="2025-12-04 11:43:37.999363826 +0000 UTC m=+5314.948539150" Dec 04 11:43:42 crc kubenswrapper[4831]: I1204 11:43:42.098122 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:42 crc kubenswrapper[4831]: I1204 11:43:42.098977 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:43 crc kubenswrapper[4831]: I1204 11:43:43.143178 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xjx2g" podUID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerName="registry-server" probeResult="failure" output=< Dec 04 11:43:43 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 04 11:43:43 crc kubenswrapper[4831]: > Dec 04 11:43:48 crc kubenswrapper[4831]: I1204 11:43:48.277599 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:43:48 crc kubenswrapper[4831]: E1204 11:43:48.278560 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:43:52 crc kubenswrapper[4831]: I1204 11:43:52.149722 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:52 crc kubenswrapper[4831]: I1204 11:43:52.207558 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:52 crc kubenswrapper[4831]: I1204 11:43:52.403372 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xjx2g"] Dec 04 11:43:53 crc kubenswrapper[4831]: I1204 11:43:53.108490 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="b3441a94-3bf3-4956-8d5b-0b88f451404b" containerID="e62e569f51abaa1285cdede070c8e6bd68d1d47f60fa8a97297ccad898a14d98" exitCode=1 Dec 04 11:43:53 crc kubenswrapper[4831]: I1204 11:43:53.108568 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b3441a94-3bf3-4956-8d5b-0b88f451404b","Type":"ContainerDied","Data":"e62e569f51abaa1285cdede070c8e6bd68d1d47f60fa8a97297ccad898a14d98"} Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.118277 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xjx2g" podUID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerName="registry-server" containerID="cri-o://1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde" gracePeriod=2 Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.608405 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.615403 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.772131 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-workdir\") pod \"b3441a94-3bf3-4956-8d5b-0b88f451404b\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.772177 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config\") pod \"b3441a94-3bf3-4956-8d5b-0b88f451404b\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.772223 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-utilities\") pod \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.772265 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-catalog-content\") pod \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.772307 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ssh-key\") pod \"b3441a94-3bf3-4956-8d5b-0b88f451404b\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.772357 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"b3441a94-3bf3-4956-8d5b-0b88f451404b\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.773112 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-utilities" (OuterVolumeSpecName: "utilities") pod "c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" (UID: "c2b8f8c5-81f5-41b5-a693-5d0eff54a25b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.773121 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-config-data\") pod \"b3441a94-3bf3-4956-8d5b-0b88f451404b\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.773186 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ca-certs\") pod \"b3441a94-3bf3-4956-8d5b-0b88f451404b\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.773224 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-temporary\") pod \"b3441a94-3bf3-4956-8d5b-0b88f451404b\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.773261 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxkw7\" (UniqueName: \"kubernetes.io/projected/b3441a94-3bf3-4956-8d5b-0b88f451404b-kube-api-access-cxkw7\") pod 
\"b3441a94-3bf3-4956-8d5b-0b88f451404b\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.773308 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccd2f\" (UniqueName: \"kubernetes.io/projected/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-kube-api-access-ccd2f\") pod \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\" (UID: \"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.773338 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config-secret\") pod \"b3441a94-3bf3-4956-8d5b-0b88f451404b\" (UID: \"b3441a94-3bf3-4956-8d5b-0b88f451404b\") " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.773676 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-config-data" (OuterVolumeSpecName: "config-data") pod "b3441a94-3bf3-4956-8d5b-0b88f451404b" (UID: "b3441a94-3bf3-4956-8d5b-0b88f451404b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.773847 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.773861 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.773864 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "b3441a94-3bf3-4956-8d5b-0b88f451404b" (UID: "b3441a94-3bf3-4956-8d5b-0b88f451404b"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.777928 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-kube-api-access-ccd2f" (OuterVolumeSpecName: "kube-api-access-ccd2f") pod "c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" (UID: "c2b8f8c5-81f5-41b5-a693-5d0eff54a25b"). InnerVolumeSpecName "kube-api-access-ccd2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.778256 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "b3441a94-3bf3-4956-8d5b-0b88f451404b" (UID: "b3441a94-3bf3-4956-8d5b-0b88f451404b"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.778284 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "b3441a94-3bf3-4956-8d5b-0b88f451404b" (UID: "b3441a94-3bf3-4956-8d5b-0b88f451404b"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.784024 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3441a94-3bf3-4956-8d5b-0b88f451404b-kube-api-access-cxkw7" (OuterVolumeSpecName: "kube-api-access-cxkw7") pod "b3441a94-3bf3-4956-8d5b-0b88f451404b" (UID: "b3441a94-3bf3-4956-8d5b-0b88f451404b"). InnerVolumeSpecName "kube-api-access-cxkw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.810227 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "b3441a94-3bf3-4956-8d5b-0b88f451404b" (UID: "b3441a94-3bf3-4956-8d5b-0b88f451404b"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.817162 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b3441a94-3bf3-4956-8d5b-0b88f451404b" (UID: "b3441a94-3bf3-4956-8d5b-0b88f451404b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.817853 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b3441a94-3bf3-4956-8d5b-0b88f451404b" (UID: "b3441a94-3bf3-4956-8d5b-0b88f451404b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.832181 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b3441a94-3bf3-4956-8d5b-0b88f451404b" (UID: "b3441a94-3bf3-4956-8d5b-0b88f451404b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.876122 4831 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.876155 4831 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.876167 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxkw7\" (UniqueName: \"kubernetes.io/projected/b3441a94-3bf3-4956-8d5b-0b88f451404b-kube-api-access-cxkw7\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.876178 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccd2f\" (UniqueName: 
\"kubernetes.io/projected/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-kube-api-access-ccd2f\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.876187 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.876196 4831 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b3441a94-3bf3-4956-8d5b-0b88f451404b-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.876207 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3441a94-3bf3-4956-8d5b-0b88f451404b-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.876215 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b3441a94-3bf3-4956-8d5b-0b88f451404b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.876249 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.900995 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" (UID: "c2b8f8c5-81f5-41b5-a693-5d0eff54a25b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.904610 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.978256 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:54 crc kubenswrapper[4831]: I1204 11:43:54.978301 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.131560 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.131558 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b3441a94-3bf3-4956-8d5b-0b88f451404b","Type":"ContainerDied","Data":"929459f7aab8ed6e5e675e06ef649edf6b9472533590b0cbf9f6e0bbec2c230e"} Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.131954 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="929459f7aab8ed6e5e675e06ef649edf6b9472533590b0cbf9f6e0bbec2c230e" Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.134892 4831 generic.go:334] "Generic (PLEG): container finished" podID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerID="1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde" exitCode=0 Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.134931 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjx2g" 
event={"ID":"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b","Type":"ContainerDied","Data":"1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde"} Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.134962 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjx2g" event={"ID":"c2b8f8c5-81f5-41b5-a693-5d0eff54a25b","Type":"ContainerDied","Data":"9a409d32f86af1ec8fc30c63fefa276263880bd4ab097a604a1023b632e98dda"} Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.134984 4831 scope.go:117] "RemoveContainer" containerID="1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde" Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.134985 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xjx2g" Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.183348 4831 scope.go:117] "RemoveContainer" containerID="dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a" Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.196150 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xjx2g"] Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.207645 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xjx2g"] Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.215880 4831 scope.go:117] "RemoveContainer" containerID="5d6a119517e1e2d32ba0274075e0ac91bdd9af031896e6708942ea17711ffa2e" Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.235365 4831 scope.go:117] "RemoveContainer" containerID="1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde" Dec 04 11:43:55 crc kubenswrapper[4831]: E1204 11:43:55.243474 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde\": container with ID 
starting with 1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde not found: ID does not exist" containerID="1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde" Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.243532 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde"} err="failed to get container status \"1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde\": rpc error: code = NotFound desc = could not find container \"1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde\": container with ID starting with 1a7cceeea1a8eea62cb2db38858ea1b945d1fa694a60cb3a12b68e2679a49cde not found: ID does not exist" Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.243567 4831 scope.go:117] "RemoveContainer" containerID="dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a" Dec 04 11:43:55 crc kubenswrapper[4831]: E1204 11:43:55.244157 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a\": container with ID starting with dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a not found: ID does not exist" containerID="dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a" Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.244204 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a"} err="failed to get container status \"dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a\": rpc error: code = NotFound desc = could not find container \"dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a\": container with ID starting with dc11c26059d0b0d8e6d7b76d475cbb3c668416d9f0d339f356aa16a50b2a6b9a not found: 
ID does not exist"
Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.244232 4831 scope.go:117] "RemoveContainer" containerID="5d6a119517e1e2d32ba0274075e0ac91bdd9af031896e6708942ea17711ffa2e"
Dec 04 11:43:55 crc kubenswrapper[4831]: E1204 11:43:55.245719 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6a119517e1e2d32ba0274075e0ac91bdd9af031896e6708942ea17711ffa2e\": container with ID starting with 5d6a119517e1e2d32ba0274075e0ac91bdd9af031896e6708942ea17711ffa2e not found: ID does not exist" containerID="5d6a119517e1e2d32ba0274075e0ac91bdd9af031896e6708942ea17711ffa2e"
Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.245767 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6a119517e1e2d32ba0274075e0ac91bdd9af031896e6708942ea17711ffa2e"} err="failed to get container status \"5d6a119517e1e2d32ba0274075e0ac91bdd9af031896e6708942ea17711ffa2e\": rpc error: code = NotFound desc = could not find container \"5d6a119517e1e2d32ba0274075e0ac91bdd9af031896e6708942ea17711ffa2e\": container with ID starting with 5d6a119517e1e2d32ba0274075e0ac91bdd9af031896e6708942ea17711ffa2e not found: ID does not exist"
Dec 04 11:43:55 crc kubenswrapper[4831]: I1204 11:43:55.294837 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" path="/var/lib/kubelet/pods/c2b8f8c5-81f5-41b5-a693-5d0eff54a25b/volumes"
Dec 04 11:44:03 crc kubenswrapper[4831]: I1204 11:44:03.285854 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:44:03 crc kubenswrapper[4831]: E1204 11:44:03.286939 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:44:06 crc kubenswrapper[4831]: I1204 11:44:06.918012 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Dec 04 11:44:06 crc kubenswrapper[4831]: E1204 11:44:06.918907 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3441a94-3bf3-4956-8d5b-0b88f451404b" containerName="tempest-tests-tempest-tests-runner"
Dec 04 11:44:06 crc kubenswrapper[4831]: I1204 11:44:06.918919 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3441a94-3bf3-4956-8d5b-0b88f451404b" containerName="tempest-tests-tempest-tests-runner"
Dec 04 11:44:06 crc kubenswrapper[4831]: E1204 11:44:06.918939 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerName="extract-utilities"
Dec 04 11:44:06 crc kubenswrapper[4831]: I1204 11:44:06.918946 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerName="extract-utilities"
Dec 04 11:44:06 crc kubenswrapper[4831]: E1204 11:44:06.918981 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerName="extract-content"
Dec 04 11:44:06 crc kubenswrapper[4831]: I1204 11:44:06.918988 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerName="extract-content"
Dec 04 11:44:06 crc kubenswrapper[4831]: E1204 11:44:06.918996 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerName="registry-server"
Dec 04 11:44:06 crc kubenswrapper[4831]: I1204 11:44:06.919002 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerName="registry-server"
Dec 04 11:44:06 crc kubenswrapper[4831]: I1204 11:44:06.919207 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b8f8c5-81f5-41b5-a693-5d0eff54a25b" containerName="registry-server"
Dec 04 11:44:06 crc kubenswrapper[4831]: I1204 11:44:06.919230 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3441a94-3bf3-4956-8d5b-0b88f451404b" containerName="tempest-tests-tempest-tests-runner"
Dec 04 11:44:06 crc kubenswrapper[4831]: I1204 11:44:06.919926 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 04 11:44:06 crc kubenswrapper[4831]: I1204 11:44:06.924110 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9bj9r"
Dec 04 11:44:06 crc kubenswrapper[4831]: I1204 11:44:06.929789 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Dec 04 11:44:07 crc kubenswrapper[4831]: I1204 11:44:07.121612 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"33b0c023-96fd-447b-bc56-2cc465eeeb09\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 04 11:44:07 crc kubenswrapper[4831]: I1204 11:44:07.121686 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmmv7\" (UniqueName: \"kubernetes.io/projected/33b0c023-96fd-447b-bc56-2cc465eeeb09-kube-api-access-pmmv7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"33b0c023-96fd-447b-bc56-2cc465eeeb09\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 04 11:44:07 crc kubenswrapper[4831]: I1204 11:44:07.224519 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"33b0c023-96fd-447b-bc56-2cc465eeeb09\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 04 11:44:07 crc kubenswrapper[4831]: I1204 11:44:07.224600 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmmv7\" (UniqueName: \"kubernetes.io/projected/33b0c023-96fd-447b-bc56-2cc465eeeb09-kube-api-access-pmmv7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"33b0c023-96fd-447b-bc56-2cc465eeeb09\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 04 11:44:07 crc kubenswrapper[4831]: I1204 11:44:07.225783 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"33b0c023-96fd-447b-bc56-2cc465eeeb09\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 04 11:44:07 crc kubenswrapper[4831]: I1204 11:44:07.249426 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmmv7\" (UniqueName: \"kubernetes.io/projected/33b0c023-96fd-447b-bc56-2cc465eeeb09-kube-api-access-pmmv7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"33b0c023-96fd-447b-bc56-2cc465eeeb09\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 04 11:44:07 crc kubenswrapper[4831]: I1204 11:44:07.266866 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"33b0c023-96fd-447b-bc56-2cc465eeeb09\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 04 11:44:07 crc kubenswrapper[4831]: I1204 11:44:07.560253 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 04 11:44:08 crc kubenswrapper[4831]: I1204 11:44:08.058851 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Dec 04 11:44:08 crc kubenswrapper[4831]: I1204 11:44:08.260855 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"33b0c023-96fd-447b-bc56-2cc465eeeb09","Type":"ContainerStarted","Data":"b99180c21b0e99c7be26213520d963666bee7627bc7a20e9a1985376e3676d2e"}
Dec 04 11:44:10 crc kubenswrapper[4831]: I1204 11:44:10.277459 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"33b0c023-96fd-447b-bc56-2cc465eeeb09","Type":"ContainerStarted","Data":"2478983d7aae2ef5147323afe8e81b98cf467a2610a0c810619be7f346f56aaa"}
Dec 04 11:44:10 crc kubenswrapper[4831]: I1204 11:44:10.298021 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.132895218 podStartE2EDuration="4.29799876s" podCreationTimestamp="2025-12-04 11:44:06 +0000 UTC" firstStartedPulling="2025-12-04 11:44:08.060256888 +0000 UTC m=+5345.009432192" lastFinishedPulling="2025-12-04 11:44:09.22536041 +0000 UTC m=+5346.174535734" observedRunningTime="2025-12-04 11:44:10.289642745 +0000 UTC m=+5347.238818059" watchObservedRunningTime="2025-12-04 11:44:10.29799876 +0000 UTC m=+5347.247174074"
Dec 04 11:44:18 crc kubenswrapper[4831]: I1204 11:44:18.276161 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:44:18 crc kubenswrapper[4831]: E1204 11:44:18.277283 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:44:29 crc kubenswrapper[4831]: I1204 11:44:29.304909 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:44:29 crc kubenswrapper[4831]: E1204 11:44:29.307370 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.763404 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f4zh2"]
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.768162 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.787837 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4zh2"]
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.857381 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-utilities\") pod \"certified-operators-f4zh2\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") " pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.857491 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-catalog-content\") pod \"certified-operators-f4zh2\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") " pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.857556 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2hv8\" (UniqueName: \"kubernetes.io/projected/6c50e513-c427-46a7-9f13-a34494454c5d-kube-api-access-b2hv8\") pod \"certified-operators-f4zh2\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") " pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.959674 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-utilities\") pod \"certified-operators-f4zh2\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") " pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.959746 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-catalog-content\") pod \"certified-operators-f4zh2\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") " pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.959783 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2hv8\" (UniqueName: \"kubernetes.io/projected/6c50e513-c427-46a7-9f13-a34494454c5d-kube-api-access-b2hv8\") pod \"certified-operators-f4zh2\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") " pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.960246 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-utilities\") pod \"certified-operators-f4zh2\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") " pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.960288 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-catalog-content\") pod \"certified-operators-f4zh2\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") " pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:38 crc kubenswrapper[4831]: I1204 11:44:38.988954 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2hv8\" (UniqueName: \"kubernetes.io/projected/6c50e513-c427-46a7-9f13-a34494454c5d-kube-api-access-b2hv8\") pod \"certified-operators-f4zh2\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") " pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:39 crc kubenswrapper[4831]: I1204 11:44:39.097927 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:39 crc kubenswrapper[4831]: I1204 11:44:39.650509 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4zh2"]
Dec 04 11:44:40 crc kubenswrapper[4831]: I1204 11:44:40.587203 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4zh2" event={"ID":"6c50e513-c427-46a7-9f13-a34494454c5d","Type":"ContainerDied","Data":"d7a0b66f9349d0c372c70303bfa747720af12ddad7d3e8af60db7e7366a45f7f"}
Dec 04 11:44:40 crc kubenswrapper[4831]: I1204 11:44:40.588828 4831 generic.go:334] "Generic (PLEG): container finished" podID="6c50e513-c427-46a7-9f13-a34494454c5d" containerID="d7a0b66f9349d0c372c70303bfa747720af12ddad7d3e8af60db7e7366a45f7f" exitCode=0
Dec 04 11:44:40 crc kubenswrapper[4831]: I1204 11:44:40.588888 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4zh2" event={"ID":"6c50e513-c427-46a7-9f13-a34494454c5d","Type":"ContainerStarted","Data":"692ca7e118030801ed2ae018a9e95b0f21d061e498c577c74546032bee33f6e0"}
Dec 04 11:44:41 crc kubenswrapper[4831]: I1204 11:44:41.601853 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4zh2" event={"ID":"6c50e513-c427-46a7-9f13-a34494454c5d","Type":"ContainerStarted","Data":"601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763"}
Dec 04 11:44:42 crc kubenswrapper[4831]: I1204 11:44:42.276797 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:44:42 crc kubenswrapper[4831]: E1204 11:44:42.277230 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:44:42 crc kubenswrapper[4831]: I1204 11:44:42.612846 4831 generic.go:334] "Generic (PLEG): container finished" podID="6c50e513-c427-46a7-9f13-a34494454c5d" containerID="601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763" exitCode=0
Dec 04 11:44:42 crc kubenswrapper[4831]: I1204 11:44:42.612894 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4zh2" event={"ID":"6c50e513-c427-46a7-9f13-a34494454c5d","Type":"ContainerDied","Data":"601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763"}
Dec 04 11:44:43 crc kubenswrapper[4831]: I1204 11:44:43.625710 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4zh2" event={"ID":"6c50e513-c427-46a7-9f13-a34494454c5d","Type":"ContainerStarted","Data":"1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e"}
Dec 04 11:44:43 crc kubenswrapper[4831]: I1204 11:44:43.648750 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f4zh2" podStartSLOduration=3.215717766 podStartE2EDuration="5.648731092s" podCreationTimestamp="2025-12-04 11:44:38 +0000 UTC" firstStartedPulling="2025-12-04 11:44:40.590526396 +0000 UTC m=+5377.539701710" lastFinishedPulling="2025-12-04 11:44:43.023539722 +0000 UTC m=+5379.972715036" observedRunningTime="2025-12-04 11:44:43.641746364 +0000 UTC m=+5380.590921688" watchObservedRunningTime="2025-12-04 11:44:43.648731092 +0000 UTC m=+5380.597906406"
Dec 04 11:44:49 crc kubenswrapper[4831]: I1204 11:44:49.099785 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:49 crc kubenswrapper[4831]: I1204 11:44:49.100403 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:49 crc kubenswrapper[4831]: I1204 11:44:49.151721 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:49 crc kubenswrapper[4831]: I1204 11:44:49.735515 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:49 crc kubenswrapper[4831]: I1204 11:44:49.781700 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4zh2"]
Dec 04 11:44:51 crc kubenswrapper[4831]: I1204 11:44:51.705228 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f4zh2" podUID="6c50e513-c427-46a7-9f13-a34494454c5d" containerName="registry-server" containerID="cri-o://1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e" gracePeriod=2
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.717448 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.719523 4831 generic.go:334] "Generic (PLEG): container finished" podID="6c50e513-c427-46a7-9f13-a34494454c5d" containerID="1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e" exitCode=0
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.719564 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4zh2" event={"ID":"6c50e513-c427-46a7-9f13-a34494454c5d","Type":"ContainerDied","Data":"1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e"}
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.719589 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4zh2" event={"ID":"6c50e513-c427-46a7-9f13-a34494454c5d","Type":"ContainerDied","Data":"692ca7e118030801ed2ae018a9e95b0f21d061e498c577c74546032bee33f6e0"}
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.719614 4831 scope.go:117] "RemoveContainer" containerID="1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e"
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.747333 4831 scope.go:117] "RemoveContainer" containerID="601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763"
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.786870 4831 scope.go:117] "RemoveContainer" containerID="d7a0b66f9349d0c372c70303bfa747720af12ddad7d3e8af60db7e7366a45f7f"
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.817057 4831 scope.go:117] "RemoveContainer" containerID="1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e"
Dec 04 11:44:52 crc kubenswrapper[4831]: E1204 11:44:52.817578 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e\": container with ID starting with 1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e not found: ID does not exist" containerID="1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e"
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.817640 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e"} err="failed to get container status \"1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e\": rpc error: code = NotFound desc = could not find container \"1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e\": container with ID starting with 1eac787770d7b5de7e3dc6b830db930b7fadf2d02a3362e0d137b4d44883de2e not found: ID does not exist"
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.817685 4831 scope.go:117] "RemoveContainer" containerID="601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763"
Dec 04 11:44:52 crc kubenswrapper[4831]: E1204 11:44:52.818086 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763\": container with ID starting with 601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763 not found: ID does not exist" containerID="601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763"
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.818114 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763"} err="failed to get container status \"601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763\": rpc error: code = NotFound desc = could not find container \"601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763\": container with ID starting with 601185a15465d42346fb15e99c0240b6000b1da8ac5bb815832fdbec6bb3e763 not found: ID does not exist"
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.818134 4831 scope.go:117] "RemoveContainer" containerID="d7a0b66f9349d0c372c70303bfa747720af12ddad7d3e8af60db7e7366a45f7f"
Dec 04 11:44:52 crc kubenswrapper[4831]: E1204 11:44:52.818378 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a0b66f9349d0c372c70303bfa747720af12ddad7d3e8af60db7e7366a45f7f\": container with ID starting with d7a0b66f9349d0c372c70303bfa747720af12ddad7d3e8af60db7e7366a45f7f not found: ID does not exist" containerID="d7a0b66f9349d0c372c70303bfa747720af12ddad7d3e8af60db7e7366a45f7f"
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.818403 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a0b66f9349d0c372c70303bfa747720af12ddad7d3e8af60db7e7366a45f7f"} err="failed to get container status \"d7a0b66f9349d0c372c70303bfa747720af12ddad7d3e8af60db7e7366a45f7f\": rpc error: code = NotFound desc = could not find container \"d7a0b66f9349d0c372c70303bfa747720af12ddad7d3e8af60db7e7366a45f7f\": container with ID starting with d7a0b66f9349d0c372c70303bfa747720af12ddad7d3e8af60db7e7366a45f7f not found: ID does not exist"
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.877225 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-catalog-content\") pod \"6c50e513-c427-46a7-9f13-a34494454c5d\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") "
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.879043 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2hv8\" (UniqueName: \"kubernetes.io/projected/6c50e513-c427-46a7-9f13-a34494454c5d-kube-api-access-b2hv8\") pod \"6c50e513-c427-46a7-9f13-a34494454c5d\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") "
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.879144 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-utilities\") pod \"6c50e513-c427-46a7-9f13-a34494454c5d\" (UID: \"6c50e513-c427-46a7-9f13-a34494454c5d\") "
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.879883 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-utilities" (OuterVolumeSpecName: "utilities") pod "6c50e513-c427-46a7-9f13-a34494454c5d" (UID: "6c50e513-c427-46a7-9f13-a34494454c5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.879986 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.891060 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c50e513-c427-46a7-9f13-a34494454c5d-kube-api-access-b2hv8" (OuterVolumeSpecName: "kube-api-access-b2hv8") pod "6c50e513-c427-46a7-9f13-a34494454c5d" (UID: "6c50e513-c427-46a7-9f13-a34494454c5d"). InnerVolumeSpecName "kube-api-access-b2hv8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.929916 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c50e513-c427-46a7-9f13-a34494454c5d" (UID: "6c50e513-c427-46a7-9f13-a34494454c5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.982695 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2hv8\" (UniqueName: \"kubernetes.io/projected/6c50e513-c427-46a7-9f13-a34494454c5d-kube-api-access-b2hv8\") on node \"crc\" DevicePath \"\""
Dec 04 11:44:52 crc kubenswrapper[4831]: I1204 11:44:52.982753 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c50e513-c427-46a7-9f13-a34494454c5d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 11:44:53 crc kubenswrapper[4831]: I1204 11:44:53.731743 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4zh2"
Dec 04 11:44:53 crc kubenswrapper[4831]: I1204 11:44:53.762057 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4zh2"]
Dec 04 11:44:53 crc kubenswrapper[4831]: I1204 11:44:53.771141 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f4zh2"]
Dec 04 11:44:55 crc kubenswrapper[4831]: I1204 11:44:55.289078 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c50e513-c427-46a7-9f13-a34494454c5d" path="/var/lib/kubelet/pods/6c50e513-c427-46a7-9f13-a34494454c5d/volumes"
Dec 04 11:44:57 crc kubenswrapper[4831]: I1204 11:44:57.276583 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:44:57 crc kubenswrapper[4831]: E1204 11:44:57.277186 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.755627 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fn8lm/must-gather-fvqbw"]
Dec 04 11:44:58 crc kubenswrapper[4831]: E1204 11:44:58.756446 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c50e513-c427-46a7-9f13-a34494454c5d" containerName="extract-content"
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.756462 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c50e513-c427-46a7-9f13-a34494454c5d" containerName="extract-content"
Dec 04 11:44:58 crc kubenswrapper[4831]: E1204 11:44:58.756473 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c50e513-c427-46a7-9f13-a34494454c5d" containerName="extract-utilities"
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.756479 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c50e513-c427-46a7-9f13-a34494454c5d" containerName="extract-utilities"
Dec 04 11:44:58 crc kubenswrapper[4831]: E1204 11:44:58.756497 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c50e513-c427-46a7-9f13-a34494454c5d" containerName="registry-server"
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.756506 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c50e513-c427-46a7-9f13-a34494454c5d" containerName="registry-server"
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.756731 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c50e513-c427-46a7-9f13-a34494454c5d" containerName="registry-server"
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.757847 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fn8lm/must-gather-fvqbw"
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.764112 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fn8lm"/"openshift-service-ca.crt"
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.764314 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fn8lm"/"kube-root-ca.crt"
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.764422 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fn8lm"/"default-dockercfg-4d2x2"
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.769350 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fn8lm/must-gather-fvqbw"]
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.904365 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-must-gather-output\") pod \"must-gather-fvqbw\" (UID: \"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222\") " pod="openshift-must-gather-fn8lm/must-gather-fvqbw"
Dec 04 11:44:58 crc kubenswrapper[4831]: I1204 11:44:58.904495 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtv8g\" (UniqueName: \"kubernetes.io/projected/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-kube-api-access-rtv8g\") pod \"must-gather-fvqbw\" (UID: \"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222\") " pod="openshift-must-gather-fn8lm/must-gather-fvqbw"
Dec 04 11:44:59 crc kubenswrapper[4831]: I1204 11:44:59.006687 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-must-gather-output\") pod \"must-gather-fvqbw\" (UID: \"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222\") " pod="openshift-must-gather-fn8lm/must-gather-fvqbw"
Dec 04 11:44:59 crc kubenswrapper[4831]: I1204 11:44:59.006787 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtv8g\" (UniqueName: \"kubernetes.io/projected/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-kube-api-access-rtv8g\") pod \"must-gather-fvqbw\" (UID: \"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222\") " pod="openshift-must-gather-fn8lm/must-gather-fvqbw"
Dec 04 11:44:59 crc kubenswrapper[4831]: I1204 11:44:59.007226 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-must-gather-output\") pod \"must-gather-fvqbw\" (UID: \"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222\") " pod="openshift-must-gather-fn8lm/must-gather-fvqbw"
Dec 04 11:44:59 crc kubenswrapper[4831]: I1204 11:44:59.034724 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtv8g\" (UniqueName: \"kubernetes.io/projected/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-kube-api-access-rtv8g\") pod \"must-gather-fvqbw\" (UID: \"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222\") " pod="openshift-must-gather-fn8lm/must-gather-fvqbw"
Dec 04 11:44:59 crc kubenswrapper[4831]: I1204 11:44:59.078861 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fn8lm/must-gather-fvqbw"
Dec 04 11:44:59 crc kubenswrapper[4831]: I1204 11:44:59.767282 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fn8lm/must-gather-fvqbw"]
Dec 04 11:44:59 crc kubenswrapper[4831]: I1204 11:44:59.784770 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/must-gather-fvqbw" event={"ID":"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222","Type":"ContainerStarted","Data":"796e5eca4b2f94511db687dd07d626446019a999c5644f4b5e56898ba72b0068"}
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.185474 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"]
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.187736 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.191279 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.191581 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.213752 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"]
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.349577 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa8434fd-1732-469d-8a25-e697ddc301d0-secret-volume\") pod \"collect-profiles-29414145-97kqb\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.349629 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zfv7\" (UniqueName: \"kubernetes.io/projected/aa8434fd-1732-469d-8a25-e697ddc301d0-kube-api-access-7zfv7\") pod \"collect-profiles-29414145-97kqb\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.349768 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa8434fd-1732-469d-8a25-e697ddc301d0-config-volume\") pod \"collect-profiles-29414145-97kqb\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.451794 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa8434fd-1732-469d-8a25-e697ddc301d0-secret-volume\") pod \"collect-profiles-29414145-97kqb\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.451839 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zfv7\" (UniqueName: \"kubernetes.io/projected/aa8434fd-1732-469d-8a25-e697ddc301d0-kube-api-access-7zfv7\") pod \"collect-profiles-29414145-97kqb\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.451951 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa8434fd-1732-469d-8a25-e697ddc301d0-config-volume\") pod \"collect-profiles-29414145-97kqb\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.453597 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa8434fd-1732-469d-8a25-e697ddc301d0-config-volume\") pod \"collect-profiles-29414145-97kqb\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.688113 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zfv7\" (UniqueName: \"kubernetes.io/projected/aa8434fd-1732-469d-8a25-e697ddc301d0-kube-api-access-7zfv7\") pod \"collect-profiles-29414145-97kqb\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.691171 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa8434fd-1732-469d-8a25-e697ddc301d0-secret-volume\") pod \"collect-profiles-29414145-97kqb\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"
Dec 04 11:45:00 crc kubenswrapper[4831]: I1204 11:45:00.821131 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb" Dec 04 11:45:01 crc kubenswrapper[4831]: I1204 11:45:01.320736 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb"] Dec 04 11:45:01 crc kubenswrapper[4831]: W1204 11:45:01.344449 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa8434fd_1732_469d_8a25_e697ddc301d0.slice/crio-fc244dbe1fedab859758a5e9d535f824159c615a5dc0f88aa74dea6c31559276 WatchSource:0}: Error finding container fc244dbe1fedab859758a5e9d535f824159c615a5dc0f88aa74dea6c31559276: Status 404 returned error can't find the container with id fc244dbe1fedab859758a5e9d535f824159c615a5dc0f88aa74dea6c31559276 Dec 04 11:45:01 crc kubenswrapper[4831]: I1204 11:45:01.811794 4831 generic.go:334] "Generic (PLEG): container finished" podID="aa8434fd-1732-469d-8a25-e697ddc301d0" containerID="15f527acefc517e877ad71def985141d2654fca1bf2567b53f5f2725c7d14ccd" exitCode=0 Dec 04 11:45:01 crc kubenswrapper[4831]: I1204 11:45:01.812100 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb" event={"ID":"aa8434fd-1732-469d-8a25-e697ddc301d0","Type":"ContainerDied","Data":"15f527acefc517e877ad71def985141d2654fca1bf2567b53f5f2725c7d14ccd"} Dec 04 11:45:01 crc kubenswrapper[4831]: I1204 11:45:01.812131 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb" event={"ID":"aa8434fd-1732-469d-8a25-e697ddc301d0","Type":"ContainerStarted","Data":"fc244dbe1fedab859758a5e9d535f824159c615a5dc0f88aa74dea6c31559276"} Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.032011 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb" Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.059762 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa8434fd-1732-469d-8a25-e697ddc301d0-secret-volume\") pod \"aa8434fd-1732-469d-8a25-e697ddc301d0\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.059893 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa8434fd-1732-469d-8a25-e697ddc301d0-config-volume\") pod \"aa8434fd-1732-469d-8a25-e697ddc301d0\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.059922 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zfv7\" (UniqueName: \"kubernetes.io/projected/aa8434fd-1732-469d-8a25-e697ddc301d0-kube-api-access-7zfv7\") pod \"aa8434fd-1732-469d-8a25-e697ddc301d0\" (UID: \"aa8434fd-1732-469d-8a25-e697ddc301d0\") " Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.061840 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa8434fd-1732-469d-8a25-e697ddc301d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "aa8434fd-1732-469d-8a25-e697ddc301d0" (UID: "aa8434fd-1732-469d-8a25-e697ddc301d0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.076992 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa8434fd-1732-469d-8a25-e697ddc301d0-kube-api-access-7zfv7" (OuterVolumeSpecName: "kube-api-access-7zfv7") pod "aa8434fd-1732-469d-8a25-e697ddc301d0" (UID: "aa8434fd-1732-469d-8a25-e697ddc301d0"). 
InnerVolumeSpecName "kube-api-access-7zfv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.084340 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8434fd-1732-469d-8a25-e697ddc301d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aa8434fd-1732-469d-8a25-e697ddc301d0" (UID: "aa8434fd-1732-469d-8a25-e697ddc301d0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.162156 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa8434fd-1732-469d-8a25-e697ddc301d0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.162187 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa8434fd-1732-469d-8a25-e697ddc301d0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.162197 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zfv7\" (UniqueName: \"kubernetes.io/projected/aa8434fd-1732-469d-8a25-e697ddc301d0-kube-api-access-7zfv7\") on node \"crc\" DevicePath \"\"" Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.841561 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb" event={"ID":"aa8434fd-1732-469d-8a25-e697ddc301d0","Type":"ContainerDied","Data":"fc244dbe1fedab859758a5e9d535f824159c615a5dc0f88aa74dea6c31559276"} Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.841599 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc244dbe1fedab859758a5e9d535f824159c615a5dc0f88aa74dea6c31559276" Dec 04 11:45:04 crc kubenswrapper[4831]: I1204 11:45:04.841639 4831 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414145-97kqb" Dec 04 11:45:05 crc kubenswrapper[4831]: I1204 11:45:05.103376 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6"] Dec 04 11:45:05 crc kubenswrapper[4831]: I1204 11:45:05.114443 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414100-xvgt6"] Dec 04 11:45:05 crc kubenswrapper[4831]: I1204 11:45:05.302197 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56755d7b-e8d8-44a8-a26d-79482eda17ac" path="/var/lib/kubelet/pods/56755d7b-e8d8-44a8-a26d-79482eda17ac/volumes" Dec 04 11:45:06 crc kubenswrapper[4831]: I1204 11:45:06.869904 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/must-gather-fvqbw" event={"ID":"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222","Type":"ContainerStarted","Data":"e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d"} Dec 04 11:45:06 crc kubenswrapper[4831]: I1204 11:45:06.870532 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/must-gather-fvqbw" event={"ID":"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222","Type":"ContainerStarted","Data":"9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964"} Dec 04 11:45:06 crc kubenswrapper[4831]: I1204 11:45:06.891519 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fn8lm/must-gather-fvqbw" podStartSLOduration=2.59469257 podStartE2EDuration="8.891497036s" podCreationTimestamp="2025-12-04 11:44:58 +0000 UTC" firstStartedPulling="2025-12-04 11:44:59.763326513 +0000 UTC m=+5396.712501827" lastFinishedPulling="2025-12-04 11:45:06.060130979 +0000 UTC m=+5403.009306293" observedRunningTime="2025-12-04 11:45:06.888175126 +0000 UTC m=+5403.837350450" watchObservedRunningTime="2025-12-04 
11:45:06.891497036 +0000 UTC m=+5403.840672350" Dec 04 11:45:10 crc kubenswrapper[4831]: I1204 11:45:10.276891 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:45:10 crc kubenswrapper[4831]: E1204 11:45:10.302386 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:45:11 crc kubenswrapper[4831]: I1204 11:45:11.117070 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fn8lm/crc-debug-zdxpx"] Dec 04 11:45:11 crc kubenswrapper[4831]: E1204 11:45:11.117793 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8434fd-1732-469d-8a25-e697ddc301d0" containerName="collect-profiles" Dec 04 11:45:11 crc kubenswrapper[4831]: I1204 11:45:11.117812 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8434fd-1732-469d-8a25-e697ddc301d0" containerName="collect-profiles" Dec 04 11:45:11 crc kubenswrapper[4831]: I1204 11:45:11.118036 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa8434fd-1732-469d-8a25-e697ddc301d0" containerName="collect-profiles" Dec 04 11:45:11 crc kubenswrapper[4831]: I1204 11:45:11.118785 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" Dec 04 11:45:11 crc kubenswrapper[4831]: I1204 11:45:11.229544 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2a1c979-d65c-4bb0-af13-231e63ba21b0-host\") pod \"crc-debug-zdxpx\" (UID: \"f2a1c979-d65c-4bb0-af13-231e63ba21b0\") " pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" Dec 04 11:45:11 crc kubenswrapper[4831]: I1204 11:45:11.229970 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn9k8\" (UniqueName: \"kubernetes.io/projected/f2a1c979-d65c-4bb0-af13-231e63ba21b0-kube-api-access-sn9k8\") pod \"crc-debug-zdxpx\" (UID: \"f2a1c979-d65c-4bb0-af13-231e63ba21b0\") " pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" Dec 04 11:45:11 crc kubenswrapper[4831]: I1204 11:45:11.332200 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2a1c979-d65c-4bb0-af13-231e63ba21b0-host\") pod \"crc-debug-zdxpx\" (UID: \"f2a1c979-d65c-4bb0-af13-231e63ba21b0\") " pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" Dec 04 11:45:11 crc kubenswrapper[4831]: I1204 11:45:11.332355 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn9k8\" (UniqueName: \"kubernetes.io/projected/f2a1c979-d65c-4bb0-af13-231e63ba21b0-kube-api-access-sn9k8\") pod \"crc-debug-zdxpx\" (UID: \"f2a1c979-d65c-4bb0-af13-231e63ba21b0\") " pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" Dec 04 11:45:11 crc kubenswrapper[4831]: I1204 11:45:11.332375 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2a1c979-d65c-4bb0-af13-231e63ba21b0-host\") pod \"crc-debug-zdxpx\" (UID: \"f2a1c979-d65c-4bb0-af13-231e63ba21b0\") " pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" Dec 04 11:45:11 crc 
kubenswrapper[4831]: I1204 11:45:11.351384 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn9k8\" (UniqueName: \"kubernetes.io/projected/f2a1c979-d65c-4bb0-af13-231e63ba21b0-kube-api-access-sn9k8\") pod \"crc-debug-zdxpx\" (UID: \"f2a1c979-d65c-4bb0-af13-231e63ba21b0\") " pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" Dec 04 11:45:11 crc kubenswrapper[4831]: I1204 11:45:11.437685 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" Dec 04 11:45:11 crc kubenswrapper[4831]: W1204 11:45:11.473880 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2a1c979_d65c_4bb0_af13_231e63ba21b0.slice/crio-566792998e2f8d89880ae95036f3e6bff3aa1048b065b940fcae3af3d41ff714 WatchSource:0}: Error finding container 566792998e2f8d89880ae95036f3e6bff3aa1048b065b940fcae3af3d41ff714: Status 404 returned error can't find the container with id 566792998e2f8d89880ae95036f3e6bff3aa1048b065b940fcae3af3d41ff714 Dec 04 11:45:11 crc kubenswrapper[4831]: I1204 11:45:11.915361 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" event={"ID":"f2a1c979-d65c-4bb0-af13-231e63ba21b0","Type":"ContainerStarted","Data":"566792998e2f8d89880ae95036f3e6bff3aa1048b065b940fcae3af3d41ff714"} Dec 04 11:45:15 crc kubenswrapper[4831]: I1204 11:45:15.756784 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hfps6"] Dec 04 11:45:15 crc kubenswrapper[4831]: I1204 11:45:15.760148 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:15 crc kubenswrapper[4831]: I1204 11:45:15.771767 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfps6"] Dec 04 11:45:15 crc kubenswrapper[4831]: I1204 11:45:15.923106 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-catalog-content\") pod \"redhat-marketplace-hfps6\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:15 crc kubenswrapper[4831]: I1204 11:45:15.923173 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5h7n\" (UniqueName: \"kubernetes.io/projected/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-kube-api-access-j5h7n\") pod \"redhat-marketplace-hfps6\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:15 crc kubenswrapper[4831]: I1204 11:45:15.923266 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-utilities\") pod \"redhat-marketplace-hfps6\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:16 crc kubenswrapper[4831]: I1204 11:45:16.025883 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-catalog-content\") pod \"redhat-marketplace-hfps6\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:16 crc kubenswrapper[4831]: I1204 11:45:16.025983 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j5h7n\" (UniqueName: \"kubernetes.io/projected/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-kube-api-access-j5h7n\") pod \"redhat-marketplace-hfps6\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:16 crc kubenswrapper[4831]: I1204 11:45:16.026066 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-utilities\") pod \"redhat-marketplace-hfps6\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:16 crc kubenswrapper[4831]: I1204 11:45:16.026538 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-catalog-content\") pod \"redhat-marketplace-hfps6\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:16 crc kubenswrapper[4831]: I1204 11:45:16.026707 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-utilities\") pod \"redhat-marketplace-hfps6\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:16 crc kubenswrapper[4831]: I1204 11:45:16.051470 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5h7n\" (UniqueName: \"kubernetes.io/projected/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-kube-api-access-j5h7n\") pod \"redhat-marketplace-hfps6\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:16 crc kubenswrapper[4831]: I1204 11:45:16.093173 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:16 crc kubenswrapper[4831]: I1204 11:45:16.629886 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfps6"] Dec 04 11:45:16 crc kubenswrapper[4831]: I1204 11:45:16.986519 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfps6" event={"ID":"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5","Type":"ContainerStarted","Data":"98f57286c02e70d87767adce8d08273ddfdf820caf2e1faf4b606a87898337d5"} Dec 04 11:45:16 crc kubenswrapper[4831]: I1204 11:45:16.987643 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfps6" event={"ID":"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5","Type":"ContainerStarted","Data":"0ceb884ccbd1793c84048e74bf7529d553da9645f4fbb543b3386d2e343228d6"} Dec 04 11:45:17 crc kubenswrapper[4831]: I1204 11:45:17.998132 4831 generic.go:334] "Generic (PLEG): container finished" podID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerID="98f57286c02e70d87767adce8d08273ddfdf820caf2e1faf4b606a87898337d5" exitCode=0 Dec 04 11:45:17 crc kubenswrapper[4831]: I1204 11:45:17.998175 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfps6" event={"ID":"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5","Type":"ContainerDied","Data":"98f57286c02e70d87767adce8d08273ddfdf820caf2e1faf4b606a87898337d5"} Dec 04 11:45:22 crc kubenswrapper[4831]: I1204 11:45:22.276330 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:45:22 crc kubenswrapper[4831]: E1204 11:45:22.277208 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:45:24 crc kubenswrapper[4831]: I1204 11:45:24.061316 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfps6" event={"ID":"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5","Type":"ContainerStarted","Data":"49a6dd85fdf42dbed91d8463e9bbf18898a8cbd728320201f2cc4f9f759c1ff0"} Dec 04 11:45:24 crc kubenswrapper[4831]: I1204 11:45:24.063754 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" event={"ID":"f2a1c979-d65c-4bb0-af13-231e63ba21b0","Type":"ContainerStarted","Data":"0bee97bea2a0097dff8c3a4236d8c1f660c86e6601af757b993b06759f308759"} Dec 04 11:45:24 crc kubenswrapper[4831]: I1204 11:45:24.083484 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" podStartSLOduration=1.280398282 podStartE2EDuration="13.083457246s" podCreationTimestamp="2025-12-04 11:45:11 +0000 UTC" firstStartedPulling="2025-12-04 11:45:11.476283718 +0000 UTC m=+5408.425459032" lastFinishedPulling="2025-12-04 11:45:23.279342682 +0000 UTC m=+5420.228517996" observedRunningTime="2025-12-04 11:45:24.074621147 +0000 UTC m=+5421.023796461" watchObservedRunningTime="2025-12-04 11:45:24.083457246 +0000 UTC m=+5421.032632560" Dec 04 11:45:25 crc kubenswrapper[4831]: I1204 11:45:25.073999 4831 generic.go:334] "Generic (PLEG): container finished" podID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerID="49a6dd85fdf42dbed91d8463e9bbf18898a8cbd728320201f2cc4f9f759c1ff0" exitCode=0 Dec 04 11:45:25 crc kubenswrapper[4831]: I1204 11:45:25.074193 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfps6" 
event={"ID":"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5","Type":"ContainerDied","Data":"49a6dd85fdf42dbed91d8463e9bbf18898a8cbd728320201f2cc4f9f759c1ff0"} Dec 04 11:45:26 crc kubenswrapper[4831]: I1204 11:45:26.087030 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfps6" event={"ID":"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5","Type":"ContainerStarted","Data":"e0d23523e51312b7828baff95af25d0c41aafd10dc0b79508ce9bcafc712ce94"} Dec 04 11:45:26 crc kubenswrapper[4831]: I1204 11:45:26.093367 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:26 crc kubenswrapper[4831]: I1204 11:45:26.093514 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:26 crc kubenswrapper[4831]: I1204 11:45:26.127690 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hfps6" podStartSLOduration=8.532319631 podStartE2EDuration="11.127675812s" podCreationTimestamp="2025-12-04 11:45:15 +0000 UTC" firstStartedPulling="2025-12-04 11:45:23.161892027 +0000 UTC m=+5420.111067341" lastFinishedPulling="2025-12-04 11:45:25.757248198 +0000 UTC m=+5422.706423522" observedRunningTime="2025-12-04 11:45:26.127379494 +0000 UTC m=+5423.076554808" watchObservedRunningTime="2025-12-04 11:45:26.127675812 +0000 UTC m=+5423.076851126" Dec 04 11:45:27 crc kubenswrapper[4831]: I1204 11:45:27.151053 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-hfps6" podUID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerName="registry-server" probeResult="failure" output=< Dec 04 11:45:27 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 04 11:45:27 crc kubenswrapper[4831]: > Dec 04 11:45:28 crc kubenswrapper[4831]: I1204 11:45:28.597410 4831 scope.go:117] 
"RemoveContainer" containerID="cb50151ba535d148a6289fdefc036efc96d45e8f164d6959103f20636645b426" Dec 04 11:45:36 crc kubenswrapper[4831]: I1204 11:45:36.152234 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:36 crc kubenswrapper[4831]: I1204 11:45:36.209582 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:36 crc kubenswrapper[4831]: I1204 11:45:36.277083 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:45:36 crc kubenswrapper[4831]: E1204 11:45:36.277349 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:45:36 crc kubenswrapper[4831]: I1204 11:45:36.394365 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfps6"] Dec 04 11:45:37 crc kubenswrapper[4831]: I1204 11:45:37.210642 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hfps6" podUID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerName="registry-server" containerID="cri-o://e0d23523e51312b7828baff95af25d0c41aafd10dc0b79508ce9bcafc712ce94" gracePeriod=2 Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.224248 4831 generic.go:334] "Generic (PLEG): container finished" podID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerID="e0d23523e51312b7828baff95af25d0c41aafd10dc0b79508ce9bcafc712ce94" exitCode=0 Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 
11:45:38.224342 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfps6" event={"ID":"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5","Type":"ContainerDied","Data":"e0d23523e51312b7828baff95af25d0c41aafd10dc0b79508ce9bcafc712ce94"} Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.224876 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfps6" event={"ID":"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5","Type":"ContainerDied","Data":"0ceb884ccbd1793c84048e74bf7529d553da9645f4fbb543b3386d2e343228d6"} Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.224903 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ceb884ccbd1793c84048e74bf7529d553da9645f4fbb543b3386d2e343228d6" Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.270187 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.427429 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-utilities\") pod \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.427544 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5h7n\" (UniqueName: \"kubernetes.io/projected/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-kube-api-access-j5h7n\") pod \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.427652 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-catalog-content\") pod 
\"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\" (UID: \"0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5\") " Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.429096 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-utilities" (OuterVolumeSpecName: "utilities") pod "0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" (UID: "0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.433520 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-kube-api-access-j5h7n" (OuterVolumeSpecName: "kube-api-access-j5h7n") pod "0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" (UID: "0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5"). InnerVolumeSpecName "kube-api-access-j5h7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.443793 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" (UID: "0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.530825 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.531167 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5h7n\" (UniqueName: \"kubernetes.io/projected/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-kube-api-access-j5h7n\") on node \"crc\" DevicePath \"\"" Dec 04 11:45:38 crc kubenswrapper[4831]: I1204 11:45:38.531183 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:45:39 crc kubenswrapper[4831]: I1204 11:45:39.234149 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfps6" Dec 04 11:45:41 crc kubenswrapper[4831]: I1204 11:45:41.124795 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfps6"] Dec 04 11:45:41 crc kubenswrapper[4831]: I1204 11:45:41.137406 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfps6"] Dec 04 11:45:41 crc kubenswrapper[4831]: I1204 11:45:41.293142 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" path="/var/lib/kubelet/pods/0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5/volumes" Dec 04 11:45:47 crc kubenswrapper[4831]: I1204 11:45:47.276974 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:45:47 crc kubenswrapper[4831]: E1204 11:45:47.277512 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.466894 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dtbj7"] Dec 04 11:45:55 crc kubenswrapper[4831]: E1204 11:45:55.467842 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerName="extract-content" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.467859 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerName="extract-content" Dec 04 11:45:55 crc kubenswrapper[4831]: E1204 11:45:55.467873 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerName="extract-utilities" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.467881 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerName="extract-utilities" Dec 04 11:45:55 crc kubenswrapper[4831]: E1204 11:45:55.467896 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerName="registry-server" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.467903 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerName="registry-server" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.468120 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc30a2f-ddba-4f23-b405-fee5f5d0f5d5" containerName="registry-server" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.469620 4831 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.482852 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dtbj7"] Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.597397 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-catalog-content\") pod \"community-operators-dtbj7\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.597637 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-utilities\") pod \"community-operators-dtbj7\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.598228 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4rbc\" (UniqueName: \"kubernetes.io/projected/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-kube-api-access-x4rbc\") pod \"community-operators-dtbj7\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.700571 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4rbc\" (UniqueName: \"kubernetes.io/projected/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-kube-api-access-x4rbc\") pod \"community-operators-dtbj7\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.700746 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-catalog-content\") pod \"community-operators-dtbj7\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.700773 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-utilities\") pod \"community-operators-dtbj7\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.701565 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-utilities\") pod \"community-operators-dtbj7\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.701699 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-catalog-content\") pod \"community-operators-dtbj7\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.720582 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4rbc\" (UniqueName: \"kubernetes.io/projected/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-kube-api-access-x4rbc\") pod \"community-operators-dtbj7\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:45:55 crc kubenswrapper[4831]: I1204 11:45:55.800277 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:45:56 crc kubenswrapper[4831]: I1204 11:45:56.364610 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dtbj7"] Dec 04 11:45:56 crc kubenswrapper[4831]: I1204 11:45:56.441025 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtbj7" event={"ID":"9b7bf3c1-2f08-4cb2-9e27-790a3189d643","Type":"ContainerStarted","Data":"1bba27f2baf656944b20906ff7895bc733033b26d4ab43cd50533849e87bca04"} Dec 04 11:45:57 crc kubenswrapper[4831]: I1204 11:45:57.451601 4831 generic.go:334] "Generic (PLEG): container finished" podID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" containerID="9b086cc39854e3fdf29f8a0d772af4fe55b0a257fe2882e775f1a001130997bc" exitCode=0 Dec 04 11:45:57 crc kubenswrapper[4831]: I1204 11:45:57.451692 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtbj7" event={"ID":"9b7bf3c1-2f08-4cb2-9e27-790a3189d643","Type":"ContainerDied","Data":"9b086cc39854e3fdf29f8a0d772af4fe55b0a257fe2882e775f1a001130997bc"} Dec 04 11:45:58 crc kubenswrapper[4831]: I1204 11:45:58.466947 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtbj7" event={"ID":"9b7bf3c1-2f08-4cb2-9e27-790a3189d643","Type":"ContainerStarted","Data":"64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51"} Dec 04 11:46:00 crc kubenswrapper[4831]: I1204 11:46:00.502460 4831 generic.go:334] "Generic (PLEG): container finished" podID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" containerID="64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51" exitCode=0 Dec 04 11:46:00 crc kubenswrapper[4831]: I1204 11:46:00.502551 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtbj7" 
event={"ID":"9b7bf3c1-2f08-4cb2-9e27-790a3189d643","Type":"ContainerDied","Data":"64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51"} Dec 04 11:46:01 crc kubenswrapper[4831]: I1204 11:46:01.276456 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:46:01 crc kubenswrapper[4831]: E1204 11:46:01.277072 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:46:02 crc kubenswrapper[4831]: I1204 11:46:02.521623 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtbj7" event={"ID":"9b7bf3c1-2f08-4cb2-9e27-790a3189d643","Type":"ContainerStarted","Data":"427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940"} Dec 04 11:46:02 crc kubenswrapper[4831]: I1204 11:46:02.549951 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dtbj7" podStartSLOduration=3.892131983 podStartE2EDuration="7.549930271s" podCreationTimestamp="2025-12-04 11:45:55 +0000 UTC" firstStartedPulling="2025-12-04 11:45:57.461517944 +0000 UTC m=+5454.410693258" lastFinishedPulling="2025-12-04 11:46:01.119316232 +0000 UTC m=+5458.068491546" observedRunningTime="2025-12-04 11:46:02.546478608 +0000 UTC m=+5459.495653932" watchObservedRunningTime="2025-12-04 11:46:02.549930271 +0000 UTC m=+5459.499105585" Dec 04 11:46:05 crc kubenswrapper[4831]: I1204 11:46:05.800871 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:46:05 crc 
kubenswrapper[4831]: I1204 11:46:05.803011 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:46:05 crc kubenswrapper[4831]: I1204 11:46:05.857649 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:46:06 crc kubenswrapper[4831]: I1204 11:46:06.607397 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:46:06 crc kubenswrapper[4831]: I1204 11:46:06.662799 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dtbj7"] Dec 04 11:46:08 crc kubenswrapper[4831]: I1204 11:46:08.574499 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dtbj7" podUID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" containerName="registry-server" containerID="cri-o://427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940" gracePeriod=2 Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.056930 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.198569 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-catalog-content\") pod \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.198726 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4rbc\" (UniqueName: \"kubernetes.io/projected/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-kube-api-access-x4rbc\") pod \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.198776 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-utilities\") pod \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\" (UID: \"9b7bf3c1-2f08-4cb2-9e27-790a3189d643\") " Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.200175 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-utilities" (OuterVolumeSpecName: "utilities") pod "9b7bf3c1-2f08-4cb2-9e27-790a3189d643" (UID: "9b7bf3c1-2f08-4cb2-9e27-790a3189d643"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.221979 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-kube-api-access-x4rbc" (OuterVolumeSpecName: "kube-api-access-x4rbc") pod "9b7bf3c1-2f08-4cb2-9e27-790a3189d643" (UID: "9b7bf3c1-2f08-4cb2-9e27-790a3189d643"). InnerVolumeSpecName "kube-api-access-x4rbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.264922 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b7bf3c1-2f08-4cb2-9e27-790a3189d643" (UID: "9b7bf3c1-2f08-4cb2-9e27-790a3189d643"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.301594 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.301636 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4rbc\" (UniqueName: \"kubernetes.io/projected/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-kube-api-access-x4rbc\") on node \"crc\" DevicePath \"\"" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.301646 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7bf3c1-2f08-4cb2-9e27-790a3189d643-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.586223 4831 generic.go:334] "Generic (PLEG): container finished" podID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" containerID="427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940" exitCode=0 Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.586287 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtbj7" event={"ID":"9b7bf3c1-2f08-4cb2-9e27-790a3189d643","Type":"ContainerDied","Data":"427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940"} Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.586334 4831 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-dtbj7" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.586358 4831 scope.go:117] "RemoveContainer" containerID="427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.586345 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtbj7" event={"ID":"9b7bf3c1-2f08-4cb2-9e27-790a3189d643","Type":"ContainerDied","Data":"1bba27f2baf656944b20906ff7895bc733033b26d4ab43cd50533849e87bca04"} Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.611510 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dtbj7"] Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.621074 4831 scope.go:117] "RemoveContainer" containerID="64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.621086 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dtbj7"] Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.644973 4831 scope.go:117] "RemoveContainer" containerID="9b086cc39854e3fdf29f8a0d772af4fe55b0a257fe2882e775f1a001130997bc" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.691004 4831 scope.go:117] "RemoveContainer" containerID="427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940" Dec 04 11:46:09 crc kubenswrapper[4831]: E1204 11:46:09.691501 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940\": container with ID starting with 427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940 not found: ID does not exist" containerID="427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.691549 
4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940"} err="failed to get container status \"427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940\": rpc error: code = NotFound desc = could not find container \"427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940\": container with ID starting with 427388a6d1330f130dec38f3629855ac1b2ce4a072442d1237a27c260af96940 not found: ID does not exist" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.691582 4831 scope.go:117] "RemoveContainer" containerID="64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51" Dec 04 11:46:09 crc kubenswrapper[4831]: E1204 11:46:09.692058 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51\": container with ID starting with 64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51 not found: ID does not exist" containerID="64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.692085 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51"} err="failed to get container status \"64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51\": rpc error: code = NotFound desc = could not find container \"64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51\": container with ID starting with 64231ae5237dc01cf420f9202628d489c49d0aaee1ad89d25a678cb7119ebc51 not found: ID does not exist" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.692104 4831 scope.go:117] "RemoveContainer" containerID="9b086cc39854e3fdf29f8a0d772af4fe55b0a257fe2882e775f1a001130997bc" Dec 04 11:46:09 crc kubenswrapper[4831]: E1204 
11:46:09.693058 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b086cc39854e3fdf29f8a0d772af4fe55b0a257fe2882e775f1a001130997bc\": container with ID starting with 9b086cc39854e3fdf29f8a0d772af4fe55b0a257fe2882e775f1a001130997bc not found: ID does not exist" containerID="9b086cc39854e3fdf29f8a0d772af4fe55b0a257fe2882e775f1a001130997bc" Dec 04 11:46:09 crc kubenswrapper[4831]: I1204 11:46:09.693082 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b086cc39854e3fdf29f8a0d772af4fe55b0a257fe2882e775f1a001130997bc"} err="failed to get container status \"9b086cc39854e3fdf29f8a0d772af4fe55b0a257fe2882e775f1a001130997bc\": rpc error: code = NotFound desc = could not find container \"9b086cc39854e3fdf29f8a0d772af4fe55b0a257fe2882e775f1a001130997bc\": container with ID starting with 9b086cc39854e3fdf29f8a0d772af4fe55b0a257fe2882e775f1a001130997bc not found: ID does not exist" Dec 04 11:46:11 crc kubenswrapper[4831]: I1204 11:46:11.289424 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" path="/var/lib/kubelet/pods/9b7bf3c1-2f08-4cb2-9e27-790a3189d643/volumes" Dec 04 11:46:14 crc kubenswrapper[4831]: I1204 11:46:14.276446 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:46:14 crc kubenswrapper[4831]: E1204 11:46:14.277226 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:46:17 crc kubenswrapper[4831]: I1204 11:46:17.672650 
4831 generic.go:334] "Generic (PLEG): container finished" podID="f2a1c979-d65c-4bb0-af13-231e63ba21b0" containerID="0bee97bea2a0097dff8c3a4236d8c1f660c86e6601af757b993b06759f308759" exitCode=0 Dec 04 11:46:17 crc kubenswrapper[4831]: I1204 11:46:17.672870 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" event={"ID":"f2a1c979-d65c-4bb0-af13-231e63ba21b0","Type":"ContainerDied","Data":"0bee97bea2a0097dff8c3a4236d8c1f660c86e6601af757b993b06759f308759"} Dec 04 11:46:18 crc kubenswrapper[4831]: I1204 11:46:18.807911 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" Dec 04 11:46:18 crc kubenswrapper[4831]: I1204 11:46:18.846523 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fn8lm/crc-debug-zdxpx"] Dec 04 11:46:18 crc kubenswrapper[4831]: I1204 11:46:18.855858 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fn8lm/crc-debug-zdxpx"] Dec 04 11:46:18 crc kubenswrapper[4831]: I1204 11:46:18.901002 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn9k8\" (UniqueName: \"kubernetes.io/projected/f2a1c979-d65c-4bb0-af13-231e63ba21b0-kube-api-access-sn9k8\") pod \"f2a1c979-d65c-4bb0-af13-231e63ba21b0\" (UID: \"f2a1c979-d65c-4bb0-af13-231e63ba21b0\") " Dec 04 11:46:18 crc kubenswrapper[4831]: I1204 11:46:18.901218 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2a1c979-d65c-4bb0-af13-231e63ba21b0-host\") pod \"f2a1c979-d65c-4bb0-af13-231e63ba21b0\" (UID: \"f2a1c979-d65c-4bb0-af13-231e63ba21b0\") " Dec 04 11:46:18 crc kubenswrapper[4831]: I1204 11:46:18.901337 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a1c979-d65c-4bb0-af13-231e63ba21b0-host" (OuterVolumeSpecName: "host") pod 
"f2a1c979-d65c-4bb0-af13-231e63ba21b0" (UID: "f2a1c979-d65c-4bb0-af13-231e63ba21b0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:46:18 crc kubenswrapper[4831]: I1204 11:46:18.901927 4831 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2a1c979-d65c-4bb0-af13-231e63ba21b0-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:46:18 crc kubenswrapper[4831]: I1204 11:46:18.907195 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a1c979-d65c-4bb0-af13-231e63ba21b0-kube-api-access-sn9k8" (OuterVolumeSpecName: "kube-api-access-sn9k8") pod "f2a1c979-d65c-4bb0-af13-231e63ba21b0" (UID: "f2a1c979-d65c-4bb0-af13-231e63ba21b0"). InnerVolumeSpecName "kube-api-access-sn9k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:46:19 crc kubenswrapper[4831]: I1204 11:46:19.003211 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn9k8\" (UniqueName: \"kubernetes.io/projected/f2a1c979-d65c-4bb0-af13-231e63ba21b0-kube-api-access-sn9k8\") on node \"crc\" DevicePath \"\"" Dec 04 11:46:19 crc kubenswrapper[4831]: I1204 11:46:19.295192 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a1c979-d65c-4bb0-af13-231e63ba21b0" path="/var/lib/kubelet/pods/f2a1c979-d65c-4bb0-af13-231e63ba21b0/volumes" Dec 04 11:46:19 crc kubenswrapper[4831]: I1204 11:46:19.694468 4831 scope.go:117] "RemoveContainer" containerID="0bee97bea2a0097dff8c3a4236d8c1f660c86e6601af757b993b06759f308759" Dec 04 11:46:19 crc kubenswrapper[4831]: I1204 11:46:19.694526 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-zdxpx" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.012319 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fn8lm/crc-debug-t9867"] Dec 04 11:46:20 crc kubenswrapper[4831]: E1204 11:46:20.013701 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" containerName="extract-utilities" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.013826 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" containerName="extract-utilities" Dec 04 11:46:20 crc kubenswrapper[4831]: E1204 11:46:20.013924 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" containerName="extract-content" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.013985 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" containerName="extract-content" Dec 04 11:46:20 crc kubenswrapper[4831]: E1204 11:46:20.014057 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" containerName="registry-server" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.014115 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" containerName="registry-server" Dec 04 11:46:20 crc kubenswrapper[4831]: E1204 11:46:20.014181 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a1c979-d65c-4bb0-af13-231e63ba21b0" containerName="container-00" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.014237 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a1c979-d65c-4bb0-af13-231e63ba21b0" containerName="container-00" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.014509 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7bf3c1-2f08-4cb2-9e27-790a3189d643" 
containerName="registry-server" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.014598 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a1c979-d65c-4bb0-af13-231e63ba21b0" containerName="container-00" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.015448 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-t9867" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.128143 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-host\") pod \"crc-debug-t9867\" (UID: \"2072ac31-35b4-4ba1-af4a-6f21e2aca79c\") " pod="openshift-must-gather-fn8lm/crc-debug-t9867" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.128252 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzvhn\" (UniqueName: \"kubernetes.io/projected/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-kube-api-access-zzvhn\") pod \"crc-debug-t9867\" (UID: \"2072ac31-35b4-4ba1-af4a-6f21e2aca79c\") " pod="openshift-must-gather-fn8lm/crc-debug-t9867" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.230221 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-host\") pod \"crc-debug-t9867\" (UID: \"2072ac31-35b4-4ba1-af4a-6f21e2aca79c\") " pod="openshift-must-gather-fn8lm/crc-debug-t9867" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.230325 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzvhn\" (UniqueName: \"kubernetes.io/projected/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-kube-api-access-zzvhn\") pod \"crc-debug-t9867\" (UID: \"2072ac31-35b4-4ba1-af4a-6f21e2aca79c\") " pod="openshift-must-gather-fn8lm/crc-debug-t9867" Dec 04 11:46:20 crc 
kubenswrapper[4831]: I1204 11:46:20.230379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-host\") pod \"crc-debug-t9867\" (UID: \"2072ac31-35b4-4ba1-af4a-6f21e2aca79c\") " pod="openshift-must-gather-fn8lm/crc-debug-t9867" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.249604 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzvhn\" (UniqueName: \"kubernetes.io/projected/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-kube-api-access-zzvhn\") pod \"crc-debug-t9867\" (UID: \"2072ac31-35b4-4ba1-af4a-6f21e2aca79c\") " pod="openshift-must-gather-fn8lm/crc-debug-t9867" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.333007 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-t9867" Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.711604 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/crc-debug-t9867" event={"ID":"2072ac31-35b4-4ba1-af4a-6f21e2aca79c","Type":"ContainerStarted","Data":"dc64faab6b989aba4421710c40a3ef9182621d2e52b39051186118746761395d"} Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.711683 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/crc-debug-t9867" event={"ID":"2072ac31-35b4-4ba1-af4a-6f21e2aca79c","Type":"ContainerStarted","Data":"e433618e4f874a72f3f2c2f6b080a59cc8708510970b1ea479c0512425da9e07"} Dec 04 11:46:20 crc kubenswrapper[4831]: I1204 11:46:20.728990 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fn8lm/crc-debug-t9867" podStartSLOduration=1.728972605 podStartE2EDuration="1.728972605s" podCreationTimestamp="2025-12-04 11:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 
11:46:20.72323825 +0000 UTC m=+5477.672413584" watchObservedRunningTime="2025-12-04 11:46:20.728972605 +0000 UTC m=+5477.678147919" Dec 04 11:46:21 crc kubenswrapper[4831]: I1204 11:46:21.721709 4831 generic.go:334] "Generic (PLEG): container finished" podID="2072ac31-35b4-4ba1-af4a-6f21e2aca79c" containerID="dc64faab6b989aba4421710c40a3ef9182621d2e52b39051186118746761395d" exitCode=0 Dec 04 11:46:21 crc kubenswrapper[4831]: I1204 11:46:21.721761 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/crc-debug-t9867" event={"ID":"2072ac31-35b4-4ba1-af4a-6f21e2aca79c","Type":"ContainerDied","Data":"dc64faab6b989aba4421710c40a3ef9182621d2e52b39051186118746761395d"} Dec 04 11:46:22 crc kubenswrapper[4831]: I1204 11:46:22.830317 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-t9867" Dec 04 11:46:22 crc kubenswrapper[4831]: I1204 11:46:22.876482 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzvhn\" (UniqueName: \"kubernetes.io/projected/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-kube-api-access-zzvhn\") pod \"2072ac31-35b4-4ba1-af4a-6f21e2aca79c\" (UID: \"2072ac31-35b4-4ba1-af4a-6f21e2aca79c\") " Dec 04 11:46:22 crc kubenswrapper[4831]: I1204 11:46:22.876561 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-host\") pod \"2072ac31-35b4-4ba1-af4a-6f21e2aca79c\" (UID: \"2072ac31-35b4-4ba1-af4a-6f21e2aca79c\") " Dec 04 11:46:22 crc kubenswrapper[4831]: I1204 11:46:22.877440 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-host" (OuterVolumeSpecName: "host") pod "2072ac31-35b4-4ba1-af4a-6f21e2aca79c" (UID: "2072ac31-35b4-4ba1-af4a-6f21e2aca79c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:46:22 crc kubenswrapper[4831]: I1204 11:46:22.892434 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-kube-api-access-zzvhn" (OuterVolumeSpecName: "kube-api-access-zzvhn") pod "2072ac31-35b4-4ba1-af4a-6f21e2aca79c" (UID: "2072ac31-35b4-4ba1-af4a-6f21e2aca79c"). InnerVolumeSpecName "kube-api-access-zzvhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:46:22 crc kubenswrapper[4831]: I1204 11:46:22.978882 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzvhn\" (UniqueName: \"kubernetes.io/projected/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-kube-api-access-zzvhn\") on node \"crc\" DevicePath \"\"" Dec 04 11:46:22 crc kubenswrapper[4831]: I1204 11:46:22.978924 4831 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2072ac31-35b4-4ba1-af4a-6f21e2aca79c-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:46:23 crc kubenswrapper[4831]: I1204 11:46:23.389420 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fn8lm/crc-debug-t9867"] Dec 04 11:46:23 crc kubenswrapper[4831]: I1204 11:46:23.399374 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fn8lm/crc-debug-t9867"] Dec 04 11:46:23 crc kubenswrapper[4831]: I1204 11:46:23.742520 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e433618e4f874a72f3f2c2f6b080a59cc8708510970b1ea479c0512425da9e07" Dec 04 11:46:23 crc kubenswrapper[4831]: I1204 11:46:23.742567 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-t9867" Dec 04 11:46:24 crc kubenswrapper[4831]: I1204 11:46:24.559626 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fn8lm/crc-debug-xskz6"] Dec 04 11:46:24 crc kubenswrapper[4831]: E1204 11:46:24.560485 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2072ac31-35b4-4ba1-af4a-6f21e2aca79c" containerName="container-00" Dec 04 11:46:24 crc kubenswrapper[4831]: I1204 11:46:24.560499 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2072ac31-35b4-4ba1-af4a-6f21e2aca79c" containerName="container-00" Dec 04 11:46:24 crc kubenswrapper[4831]: I1204 11:46:24.560708 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2072ac31-35b4-4ba1-af4a-6f21e2aca79c" containerName="container-00" Dec 04 11:46:24 crc kubenswrapper[4831]: I1204 11:46:24.561433 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-xskz6" Dec 04 11:46:24 crc kubenswrapper[4831]: I1204 11:46:24.616906 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kngl5\" (UniqueName: \"kubernetes.io/projected/b6b9f49f-b55a-4e75-961f-249e555d3d07-kube-api-access-kngl5\") pod \"crc-debug-xskz6\" (UID: \"b6b9f49f-b55a-4e75-961f-249e555d3d07\") " pod="openshift-must-gather-fn8lm/crc-debug-xskz6" Dec 04 11:46:24 crc kubenswrapper[4831]: I1204 11:46:24.617171 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6b9f49f-b55a-4e75-961f-249e555d3d07-host\") pod \"crc-debug-xskz6\" (UID: \"b6b9f49f-b55a-4e75-961f-249e555d3d07\") " pod="openshift-must-gather-fn8lm/crc-debug-xskz6" Dec 04 11:46:24 crc kubenswrapper[4831]: I1204 11:46:24.718951 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kngl5\" (UniqueName: 
\"kubernetes.io/projected/b6b9f49f-b55a-4e75-961f-249e555d3d07-kube-api-access-kngl5\") pod \"crc-debug-xskz6\" (UID: \"b6b9f49f-b55a-4e75-961f-249e555d3d07\") " pod="openshift-must-gather-fn8lm/crc-debug-xskz6" Dec 04 11:46:24 crc kubenswrapper[4831]: I1204 11:46:24.719109 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6b9f49f-b55a-4e75-961f-249e555d3d07-host\") pod \"crc-debug-xskz6\" (UID: \"b6b9f49f-b55a-4e75-961f-249e555d3d07\") " pod="openshift-must-gather-fn8lm/crc-debug-xskz6" Dec 04 11:46:24 crc kubenswrapper[4831]: I1204 11:46:24.719206 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6b9f49f-b55a-4e75-961f-249e555d3d07-host\") pod \"crc-debug-xskz6\" (UID: \"b6b9f49f-b55a-4e75-961f-249e555d3d07\") " pod="openshift-must-gather-fn8lm/crc-debug-xskz6" Dec 04 11:46:24 crc kubenswrapper[4831]: I1204 11:46:24.738128 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kngl5\" (UniqueName: \"kubernetes.io/projected/b6b9f49f-b55a-4e75-961f-249e555d3d07-kube-api-access-kngl5\") pod \"crc-debug-xskz6\" (UID: \"b6b9f49f-b55a-4e75-961f-249e555d3d07\") " pod="openshift-must-gather-fn8lm/crc-debug-xskz6" Dec 04 11:46:24 crc kubenswrapper[4831]: I1204 11:46:24.888008 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-xskz6" Dec 04 11:46:24 crc kubenswrapper[4831]: W1204 11:46:24.921218 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6b9f49f_b55a_4e75_961f_249e555d3d07.slice/crio-0b1ee3f79f25d9c523fefd3240f26b979b8c8c64962c43cdb12f9b66d2fd29f7 WatchSource:0}: Error finding container 0b1ee3f79f25d9c523fefd3240f26b979b8c8c64962c43cdb12f9b66d2fd29f7: Status 404 returned error can't find the container with id 0b1ee3f79f25d9c523fefd3240f26b979b8c8c64962c43cdb12f9b66d2fd29f7 Dec 04 11:46:25 crc kubenswrapper[4831]: I1204 11:46:25.290948 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2072ac31-35b4-4ba1-af4a-6f21e2aca79c" path="/var/lib/kubelet/pods/2072ac31-35b4-4ba1-af4a-6f21e2aca79c/volumes" Dec 04 11:46:25 crc kubenswrapper[4831]: I1204 11:46:25.767306 4831 generic.go:334] "Generic (PLEG): container finished" podID="b6b9f49f-b55a-4e75-961f-249e555d3d07" containerID="33e072f16cc6672f0315d2e179d864f80511429927e58f2f193c334e2af618c1" exitCode=0 Dec 04 11:46:25 crc kubenswrapper[4831]: I1204 11:46:25.767389 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/crc-debug-xskz6" event={"ID":"b6b9f49f-b55a-4e75-961f-249e555d3d07","Type":"ContainerDied","Data":"33e072f16cc6672f0315d2e179d864f80511429927e58f2f193c334e2af618c1"} Dec 04 11:46:25 crc kubenswrapper[4831]: I1204 11:46:25.767726 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/crc-debug-xskz6" event={"ID":"b6b9f49f-b55a-4e75-961f-249e555d3d07","Type":"ContainerStarted","Data":"0b1ee3f79f25d9c523fefd3240f26b979b8c8c64962c43cdb12f9b66d2fd29f7"} Dec 04 11:46:25 crc kubenswrapper[4831]: I1204 11:46:25.816920 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fn8lm/crc-debug-xskz6"] Dec 04 11:46:25 crc kubenswrapper[4831]: I1204 11:46:25.831565 4831 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fn8lm/crc-debug-xskz6"] Dec 04 11:46:26 crc kubenswrapper[4831]: I1204 11:46:26.897104 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-xskz6" Dec 04 11:46:26 crc kubenswrapper[4831]: I1204 11:46:26.967817 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6b9f49f-b55a-4e75-961f-249e555d3d07-host\") pod \"b6b9f49f-b55a-4e75-961f-249e555d3d07\" (UID: \"b6b9f49f-b55a-4e75-961f-249e555d3d07\") " Dec 04 11:46:26 crc kubenswrapper[4831]: I1204 11:46:26.967963 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kngl5\" (UniqueName: \"kubernetes.io/projected/b6b9f49f-b55a-4e75-961f-249e555d3d07-kube-api-access-kngl5\") pod \"b6b9f49f-b55a-4e75-961f-249e555d3d07\" (UID: \"b6b9f49f-b55a-4e75-961f-249e555d3d07\") " Dec 04 11:46:26 crc kubenswrapper[4831]: I1204 11:46:26.967958 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6b9f49f-b55a-4e75-961f-249e555d3d07-host" (OuterVolumeSpecName: "host") pod "b6b9f49f-b55a-4e75-961f-249e555d3d07" (UID: "b6b9f49f-b55a-4e75-961f-249e555d3d07"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:46:26 crc kubenswrapper[4831]: I1204 11:46:26.968397 4831 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6b9f49f-b55a-4e75-961f-249e555d3d07-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:46:27 crc kubenswrapper[4831]: I1204 11:46:27.015327 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b9f49f-b55a-4e75-961f-249e555d3d07-kube-api-access-kngl5" (OuterVolumeSpecName: "kube-api-access-kngl5") pod "b6b9f49f-b55a-4e75-961f-249e555d3d07" (UID: "b6b9f49f-b55a-4e75-961f-249e555d3d07"). InnerVolumeSpecName "kube-api-access-kngl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:46:27 crc kubenswrapper[4831]: I1204 11:46:27.070226 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kngl5\" (UniqueName: \"kubernetes.io/projected/b6b9f49f-b55a-4e75-961f-249e555d3d07-kube-api-access-kngl5\") on node \"crc\" DevicePath \"\"" Dec 04 11:46:27 crc kubenswrapper[4831]: I1204 11:46:27.286795 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b9f49f-b55a-4e75-961f-249e555d3d07" path="/var/lib/kubelet/pods/b6b9f49f-b55a-4e75-961f-249e555d3d07/volumes" Dec 04 11:46:27 crc kubenswrapper[4831]: I1204 11:46:27.802878 4831 scope.go:117] "RemoveContainer" containerID="33e072f16cc6672f0315d2e179d864f80511429927e58f2f193c334e2af618c1" Dec 04 11:46:27 crc kubenswrapper[4831]: I1204 11:46:27.802921 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fn8lm/crc-debug-xskz6" Dec 04 11:46:28 crc kubenswrapper[4831]: I1204 11:46:28.278533 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:46:28 crc kubenswrapper[4831]: E1204 11:46:28.279158 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:46:40 crc kubenswrapper[4831]: I1204 11:46:40.277770 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:46:40 crc kubenswrapper[4831]: E1204 11:46:40.278396 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:46:52 crc kubenswrapper[4831]: I1204 11:46:52.294233 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-78b878b7bb-lxbbq_dead6584-8c6b-4231-b0c6-54d83d05c250/barbican-api/0.log" Dec 04 11:46:52 crc kubenswrapper[4831]: I1204 11:46:52.490551 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-78b878b7bb-lxbbq_dead6584-8c6b-4231-b0c6-54d83d05c250/barbican-api-log/0.log" Dec 04 11:46:52 crc kubenswrapper[4831]: I1204 11:46:52.570150 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-69bc569cc4-dg5gj_734f14e8-f267-4c7c-a5eb-d76457ec9d69/barbican-keystone-listener/0.log" Dec 04 11:46:52 crc kubenswrapper[4831]: I1204 11:46:52.647001 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-69bc569cc4-dg5gj_734f14e8-f267-4c7c-a5eb-d76457ec9d69/barbican-keystone-listener-log/0.log" Dec 04 11:46:52 crc kubenswrapper[4831]: I1204 11:46:52.799018 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6896bbf7f5-dvgps_8c3dabff-2635-4e29-9651-8df5d84838f9/barbican-worker/0.log" Dec 04 11:46:52 crc kubenswrapper[4831]: I1204 11:46:52.811828 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6896bbf7f5-dvgps_8c3dabff-2635-4e29-9651-8df5d84838f9/barbican-worker-log/0.log" Dec 04 11:46:53 crc kubenswrapper[4831]: I1204 11:46:53.039239 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-9wp7s_7d4bb48f-fa66-44cf-ab52-2fd190bdde16/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:46:53 crc kubenswrapper[4831]: I1204 11:46:53.085964 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_840623b4-007c-441a-9c28-53ebf2e02b5c/ceilometer-central-agent/0.log" Dec 04 11:46:53 crc kubenswrapper[4831]: I1204 11:46:53.216969 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_840623b4-007c-441a-9c28-53ebf2e02b5c/ceilometer-notification-agent/0.log" Dec 04 11:46:53 crc kubenswrapper[4831]: I1204 11:46:53.282464 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:46:53 crc kubenswrapper[4831]: E1204 11:46:53.282767 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:46:53 crc kubenswrapper[4831]: I1204 11:46:53.286208 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_840623b4-007c-441a-9c28-53ebf2e02b5c/proxy-httpd/0.log" Dec 04 11:46:53 crc kubenswrapper[4831]: I1204 11:46:53.349726 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_840623b4-007c-441a-9c28-53ebf2e02b5c/sg-core/0.log" Dec 04 11:46:53 crc kubenswrapper[4831]: I1204 11:46:53.592074 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f61bdc02-a856-448f-9cdf-f3c43efc4bfc/cinder-api-log/0.log" Dec 04 11:46:53 crc kubenswrapper[4831]: I1204 11:46:53.907218 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_610ff7e3-10d8-460e-a7a1-2ad48221b858/probe/0.log" Dec 04 11:46:53 crc kubenswrapper[4831]: I1204 11:46:53.939709 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f61bdc02-a856-448f-9cdf-f3c43efc4bfc/cinder-api/0.log" Dec 04 11:46:53 crc kubenswrapper[4831]: I1204 11:46:53.997776 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_610ff7e3-10d8-460e-a7a1-2ad48221b858/cinder-backup/0.log" Dec 04 11:46:54 crc kubenswrapper[4831]: I1204 11:46:54.143492 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f/cinder-scheduler/0.log" Dec 04 11:46:54 crc kubenswrapper[4831]: I1204 11:46:54.228447 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5a0cad59-03e2-4ed0-84ee-c51f7c1b5e3f/probe/0.log" Dec 04 11:46:54 crc kubenswrapper[4831]: I1204 11:46:54.414216 4831 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_6f75845a-f441-4d6a-a971-e24a8010d7fe/probe/0.log" Dec 04 11:46:54 crc kubenswrapper[4831]: I1204 11:46:54.438436 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_6f75845a-f441-4d6a-a971-e24a8010d7fe/cinder-volume/0.log" Dec 04 11:46:54 crc kubenswrapper[4831]: I1204 11:46:54.684556 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_c5605310-5110-4a33-84d3-56518bd49d56/cinder-volume/0.log" Dec 04 11:46:54 crc kubenswrapper[4831]: I1204 11:46:54.703844 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_c5605310-5110-4a33-84d3-56518bd49d56/probe/0.log" Dec 04 11:46:54 crc kubenswrapper[4831]: I1204 11:46:54.782730 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xdn78_4acb6010-6e0f-45c8-b768-1bd3a9a090c8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:46:54 crc kubenswrapper[4831]: I1204 11:46:54.925974 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6glvd_62c2750f-dec9-4b21-b4c2-e1ee98b3b754/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:46:55 crc kubenswrapper[4831]: I1204 11:46:55.205832 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5fc4fb97c9-v2hrg_06199f1c-bca8-4702-8fe5-f7e6512884f6/init/0.log" Dec 04 11:46:55 crc kubenswrapper[4831]: I1204 11:46:55.476921 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5fc4fb97c9-v2hrg_06199f1c-bca8-4702-8fe5-f7e6512884f6/init/0.log" Dec 04 11:46:55 crc kubenswrapper[4831]: I1204 11:46:55.551589 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-b924b_bb1963f3-7a7f-40b9-a9b2-b74f220bebb2/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:46:55 crc kubenswrapper[4831]: I1204 11:46:55.647051 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5fc4fb97c9-v2hrg_06199f1c-bca8-4702-8fe5-f7e6512884f6/dnsmasq-dns/0.log" Dec 04 11:46:55 crc kubenswrapper[4831]: I1204 11:46:55.792542 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_110d0cb6-3ad6-4d48-ae88-1864408c86af/glance-httpd/0.log" Dec 04 11:46:55 crc kubenswrapper[4831]: I1204 11:46:55.815366 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_110d0cb6-3ad6-4d48-ae88-1864408c86af/glance-log/0.log" Dec 04 11:46:56 crc kubenswrapper[4831]: I1204 11:46:56.024105 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8fb48518-ba79-45b0-8f47-51305a47805a/glance-log/0.log" Dec 04 11:46:56 crc kubenswrapper[4831]: I1204 11:46:56.030527 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8fb48518-ba79-45b0-8f47-51305a47805a/glance-httpd/0.log" Dec 04 11:46:56 crc kubenswrapper[4831]: I1204 11:46:56.219208 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5cf8898794-dbfdf_f1f8e9de-4491-4e25-bee0-457da0500046/horizon/0.log" Dec 04 11:46:56 crc kubenswrapper[4831]: I1204 11:46:56.300836 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-2bbb6_2b1e2fb0-a6df-4568-a7d9-cce135438da5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:46:56 crc kubenswrapper[4831]: I1204 11:46:56.433611 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-cdjgr_6caf006b-f650-4ea1-89bd-466b524c2049/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:46:56 crc kubenswrapper[4831]: I1204 11:46:56.666567 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29414101-mjsx6_5cd83458-aa47-4478-84dc-3dfeaf0829e1/keystone-cron/0.log" Dec 04 11:46:56 crc kubenswrapper[4831]: I1204 11:46:56.868598 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_bd604d68-70ec-4bc4-bd2a-8bc427ced498/kube-state-metrics/0.log" Dec 04 11:46:56 crc kubenswrapper[4831]: I1204 11:46:56.990250 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5cf8898794-dbfdf_f1f8e9de-4491-4e25-bee0-457da0500046/horizon-log/0.log" Dec 04 11:46:57 crc kubenswrapper[4831]: I1204 11:46:57.146631 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6f799f5dc-hv6fj_3d4c2366-4b32-485c-88d3-2e6ff2d19bc8/keystone-api/0.log" Dec 04 11:46:57 crc kubenswrapper[4831]: I1204 11:46:57.151556 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rh529_edce9302-713f-454f-b725-e30e8f594cd3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:46:57 crc kubenswrapper[4831]: I1204 11:46:57.621418 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfl8k_e2b14e5b-5d51-4f51-b199-7dc570d507b6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:46:57 crc kubenswrapper[4831]: I1204 11:46:57.647718 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-66c4c75c85-69mpg_2f2179f9-7122-438d-85cc-012b724ccae8/neutron-httpd/0.log" Dec 04 11:46:57 crc kubenswrapper[4831]: I1204 11:46:57.717044 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-66c4c75c85-69mpg_2f2179f9-7122-438d-85cc-012b724ccae8/neutron-api/0.log" Dec 04 11:46:58 crc kubenswrapper[4831]: I1204 11:46:58.332566 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8deabc31-a8c8-43f8-b854-d25d8e13f9ed/nova-cell0-conductor-conductor/0.log" Dec 04 11:46:58 crc kubenswrapper[4831]: I1204 11:46:58.711808 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b8f4c049-2837-400e-8ee0-accb79c79fc5/nova-cell1-conductor-conductor/0.log" Dec 04 11:46:58 crc kubenswrapper[4831]: I1204 11:46:58.983225 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5cc0cf0e-1e1e-4c0b-8b47-fafcdf58b57b/nova-cell1-novncproxy-novncproxy/0.log" Dec 04 11:46:59 crc kubenswrapper[4831]: I1204 11:46:59.248022 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a4df7ced-bd99-4850-aced-9704ea48a817/nova-api-log/0.log" Dec 04 11:46:59 crc kubenswrapper[4831]: I1204 11:46:59.275361 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jmdvz_f93bea5d-e88b-491b-aafb-a86d7bdfa024/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:46:59 crc kubenswrapper[4831]: I1204 11:46:59.618627 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_11eb651c-b7cf-4ab7-af1c-4d9824621711/nova-metadata-log/0.log" Dec 04 11:46:59 crc kubenswrapper[4831]: I1204 11:46:59.855495 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a4df7ced-bd99-4850-aced-9704ea48a817/nova-api-api/0.log" Dec 04 11:47:00 crc kubenswrapper[4831]: I1204 11:47:00.153098 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8b3ada8b-0b18-4561-a770-80c259283ce1/mysql-bootstrap/0.log" Dec 04 11:47:00 crc kubenswrapper[4831]: I1204 11:47:00.289060 
4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d1c3c00f-9970-4177-913d-64eb1e895bec/nova-scheduler-scheduler/0.log" Dec 04 11:47:00 crc kubenswrapper[4831]: I1204 11:47:00.372737 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8b3ada8b-0b18-4561-a770-80c259283ce1/galera/0.log" Dec 04 11:47:00 crc kubenswrapper[4831]: I1204 11:47:00.386305 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8b3ada8b-0b18-4561-a770-80c259283ce1/mysql-bootstrap/0.log" Dec 04 11:47:00 crc kubenswrapper[4831]: I1204 11:47:00.667527 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d6faa4d2-bf02-4ed3-baa8-4fcf903fa422/mysql-bootstrap/0.log" Dec 04 11:47:00 crc kubenswrapper[4831]: I1204 11:47:00.812545 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d6faa4d2-bf02-4ed3-baa8-4fcf903fa422/mysql-bootstrap/0.log" Dec 04 11:47:00 crc kubenswrapper[4831]: I1204 11:47:00.844321 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d6faa4d2-bf02-4ed3-baa8-4fcf903fa422/galera/0.log" Dec 04 11:47:01 crc kubenswrapper[4831]: I1204 11:47:01.064360 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_e13429a8-0182-4bfb-99b2-d1941a1e9af6/openstackclient/0.log" Dec 04 11:47:01 crc kubenswrapper[4831]: I1204 11:47:01.156651 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-2l4h5_3bc807f5-5d11-4f15-aff6-7c5377f10b33/ovn-controller/0.log" Dec 04 11:47:01 crc kubenswrapper[4831]: I1204 11:47:01.354363 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lwc45_85ec4bf8-71be-4822-9afe-8d09a32d8a11/openstack-network-exporter/0.log" Dec 04 11:47:01 crc kubenswrapper[4831]: I1204 11:47:01.562185 4831 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9dj4_104e4cc3-8dfb-4315-8a21-91454f3b1a45/ovsdb-server-init/0.log" Dec 04 11:47:01 crc kubenswrapper[4831]: I1204 11:47:01.790321 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9dj4_104e4cc3-8dfb-4315-8a21-91454f3b1a45/ovsdb-server-init/0.log" Dec 04 11:47:01 crc kubenswrapper[4831]: I1204 11:47:01.793842 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_11eb651c-b7cf-4ab7-af1c-4d9824621711/nova-metadata-metadata/0.log" Dec 04 11:47:01 crc kubenswrapper[4831]: I1204 11:47:01.826184 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9dj4_104e4cc3-8dfb-4315-8a21-91454f3b1a45/ovsdb-server/0.log" Dec 04 11:47:02 crc kubenswrapper[4831]: I1204 11:47:02.127465 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-c58gq_514cea2a-db2a-476e-aace-741121838112/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:47:02 crc kubenswrapper[4831]: I1204 11:47:02.239487 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9959d097-1e28-411b-a24c-6040036e2f1a/openstack-network-exporter/0.log" Dec 04 11:47:02 crc kubenswrapper[4831]: I1204 11:47:02.252620 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9dj4_104e4cc3-8dfb-4315-8a21-91454f3b1a45/ovs-vswitchd/0.log" Dec 04 11:47:02 crc kubenswrapper[4831]: I1204 11:47:02.305893 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9959d097-1e28-411b-a24c-6040036e2f1a/ovn-northd/0.log" Dec 04 11:47:02 crc kubenswrapper[4831]: I1204 11:47:02.511623 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d7c249d5-00a5-428b-b259-54d4147f8392/openstack-network-exporter/0.log" Dec 04 11:47:02 crc kubenswrapper[4831]: I1204 11:47:02.519145 4831 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d7c249d5-00a5-428b-b259-54d4147f8392/ovsdbserver-nb/0.log" Dec 04 11:47:02 crc kubenswrapper[4831]: I1204 11:47:02.670196 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0e48a25d-4544-4e4e-a835-8086fbb60f4d/openstack-network-exporter/0.log" Dec 04 11:47:02 crc kubenswrapper[4831]: I1204 11:47:02.808232 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0e48a25d-4544-4e4e-a835-8086fbb60f4d/ovsdbserver-sb/0.log" Dec 04 11:47:03 crc kubenswrapper[4831]: I1204 11:47:03.129853 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_625718e4-29fd-4886-ae65-76091a6def3c/init-config-reloader/0.log" Dec 04 11:47:03 crc kubenswrapper[4831]: I1204 11:47:03.172565 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-744bfc5f58-hs9c9_24ad7f65-67eb-4f94-9d64-2d14e393c978/placement-api/0.log" Dec 04 11:47:03 crc kubenswrapper[4831]: I1204 11:47:03.220946 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-744bfc5f58-hs9c9_24ad7f65-67eb-4f94-9d64-2d14e393c978/placement-log/0.log" Dec 04 11:47:03 crc kubenswrapper[4831]: I1204 11:47:03.323798 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_625718e4-29fd-4886-ae65-76091a6def3c/init-config-reloader/0.log" Dec 04 11:47:03 crc kubenswrapper[4831]: I1204 11:47:03.375164 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_625718e4-29fd-4886-ae65-76091a6def3c/config-reloader/0.log" Dec 04 11:47:03 crc kubenswrapper[4831]: I1204 11:47:03.421441 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_625718e4-29fd-4886-ae65-76091a6def3c/prometheus/0.log" Dec 04 11:47:03 crc kubenswrapper[4831]: I1204 11:47:03.479557 4831 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_625718e4-29fd-4886-ae65-76091a6def3c/thanos-sidecar/0.log"
Dec 04 11:47:03 crc kubenswrapper[4831]: I1204 11:47:03.696846 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7544cda9-54d6-47c9-8ba1-0834b882e674/setup-container/0.log"
Dec 04 11:47:03 crc kubenswrapper[4831]: I1204 11:47:03.861305 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7544cda9-54d6-47c9-8ba1-0834b882e674/setup-container/0.log"
Dec 04 11:47:03 crc kubenswrapper[4831]: I1204 11:47:03.901960 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d13ed0c0-494b-46b5-965d-1426a9575119/setup-container/0.log"
Dec 04 11:47:03 crc kubenswrapper[4831]: I1204 11:47:03.962637 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7544cda9-54d6-47c9-8ba1-0834b882e674/rabbitmq/0.log"
Dec 04 11:47:04 crc kubenswrapper[4831]: I1204 11:47:04.174198 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d13ed0c0-494b-46b5-965d-1426a9575119/setup-container/0.log"
Dec 04 11:47:04 crc kubenswrapper[4831]: I1204 11:47:04.180802 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d13ed0c0-494b-46b5-965d-1426a9575119/rabbitmq/0.log"
Dec 04 11:47:04 crc kubenswrapper[4831]: I1204 11:47:04.263561 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_75f6fd12-4651-4b5f-9eec-d192367b85ad/setup-container/0.log"
Dec 04 11:47:04 crc kubenswrapper[4831]: I1204 11:47:04.280281 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:47:04 crc kubenswrapper[4831]: E1204 11:47:04.280487 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:47:04 crc kubenswrapper[4831]: I1204 11:47:04.696633 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_75f6fd12-4651-4b5f-9eec-d192367b85ad/setup-container/0.log"
Dec 04 11:47:04 crc kubenswrapper[4831]: I1204 11:47:04.792619 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_75f6fd12-4651-4b5f-9eec-d192367b85ad/rabbitmq/0.log"
Dec 04 11:47:04 crc kubenswrapper[4831]: I1204 11:47:04.821636 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-592mg_a83de3fa-04fa-43f2-bfd0-48e9e3928c34/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 11:47:05 crc kubenswrapper[4831]: I1204 11:47:05.032995 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-sbrlp_e4d67c27-1b02-4823-8635-621753ee6278/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 11:47:05 crc kubenswrapper[4831]: I1204 11:47:05.038694 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-lrlqc_ea309f4e-4dea-4d32-b2ac-7ecf505d9341/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 11:47:05 crc kubenswrapper[4831]: I1204 11:47:05.321637 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-6wmvp_af857683-5749-4e71-8714-0049cd774f67/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 11:47:05 crc kubenswrapper[4831]: I1204 11:47:05.391930 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-rjtxp_b4e80c5d-a48e-4f72-b5f5-b3c5ed331758/ssh-known-hosts-edpm-deployment/0.log"
Dec 04 11:47:05 crc kubenswrapper[4831]: I1204 11:47:05.699707 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-8c9449df7-jllzg_4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0/proxy-server/0.log"
Dec 04 11:47:05 crc kubenswrapper[4831]: I1204 11:47:05.814486 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-xh7nf_6f9eb652-90e2-4231-a441-a9947e9fc782/swift-ring-rebalance/0.log"
Dec 04 11:47:05 crc kubenswrapper[4831]: I1204 11:47:05.869799 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-8c9449df7-jllzg_4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0/proxy-httpd/0.log"
Dec 04 11:47:05 crc kubenswrapper[4831]: I1204 11:47:05.952859 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/account-auditor/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.089766 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/account-reaper/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.130749 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/account-replicator/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.246514 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/account-server/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.266512 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/container-auditor/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.400238 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/container-server/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.407832 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/container-replicator/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.469092 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/container-updater/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.574706 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/object-auditor/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.629920 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/object-expirer/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.667989 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/object-server/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.682072 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/object-replicator/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.807451 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/object-updater/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.812464 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/rsync/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.905933 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b58305-3383-4cf9-9127-481c1bf16ba5/swift-recon-cron/0.log"
Dec 04 11:47:06 crc kubenswrapper[4831]: I1204 11:47:06.986621 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4e102e9d-f79c-447b-9dff-dfb887b330fe/memcached/0.log"
Dec 04 11:47:07 crc kubenswrapper[4831]: I1204 11:47:07.101904 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xbjd9_c7c8a31d-edf2-4e59-b66f-2e5ddab99661/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 11:47:07 crc kubenswrapper[4831]: I1204 11:47:07.198361 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_33b0c023-96fd-447b-bc56-2cc465eeeb09/test-operator-logs-container/0.log"
Dec 04 11:47:07 crc kubenswrapper[4831]: I1204 11:47:07.384328 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-pgmvz_c5b9af01-df1b-475d-9333-b620a4eacf73/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 11:47:07 crc kubenswrapper[4831]: I1204 11:47:07.827219 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b3441a94-3bf3-4956-8d5b-0b88f451404b/tempest-tests-tempest-tests-runner/0.log"
Dec 04 11:47:08 crc kubenswrapper[4831]: I1204 11:47:08.398400 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_8d998428-237f-4a86-9be9-8e3f563003b9/watcher-applier/0.log"
Dec 04 11:47:08 crc kubenswrapper[4831]: I1204 11:47:08.896910 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5eab9136-8caf-4dc6-81e4-3d8544e3ad94/watcher-api-log/0.log"
Dec 04 11:47:11 crc kubenswrapper[4831]: I1204 11:47:11.107581 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_0edaf6c9-d794-4d0c-ab0c-35d46001545a/watcher-decision-engine/0.log"
Dec 04 11:47:12 crc kubenswrapper[4831]: I1204 11:47:12.027709 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5eab9136-8caf-4dc6-81e4-3d8544e3ad94/watcher-api/0.log"
Dec 04 11:47:18 crc kubenswrapper[4831]: I1204 11:47:18.277142 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:47:18 crc kubenswrapper[4831]: E1204 11:47:18.278998 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:47:29 crc kubenswrapper[4831]: I1204 11:47:29.277300 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:47:29 crc kubenswrapper[4831]: E1204 11:47:29.278051 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:47:32 crc kubenswrapper[4831]: I1204 11:47:32.633108 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q_b1dcbe95-c901-4e40-a828-144992b53376/util/0.log"
Dec 04 11:47:32 crc kubenswrapper[4831]: I1204 11:47:32.813515 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q_b1dcbe95-c901-4e40-a828-144992b53376/pull/0.log"
Dec 04 11:47:32 crc kubenswrapper[4831]: I1204 11:47:32.825037 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q_b1dcbe95-c901-4e40-a828-144992b53376/util/0.log"
Dec 04 11:47:32 crc kubenswrapper[4831]: I1204 11:47:32.845687 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q_b1dcbe95-c901-4e40-a828-144992b53376/pull/0.log"
Dec 04 11:47:32 crc kubenswrapper[4831]: I1204 11:47:32.996955 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q_b1dcbe95-c901-4e40-a828-144992b53376/pull/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.016766 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q_b1dcbe95-c901-4e40-a828-144992b53376/util/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.037099 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2bb7770a7d2a9c343b136e044363c452b5840c9b8cf22f31979453c8f154n8q_b1dcbe95-c901-4e40-a828-144992b53376/extract/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.203388 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5bfbbb859d-bcnp2_2fa5a388-dd09-43cb-90d4-01bc536e1e82/kube-rbac-proxy/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.329361 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748967c98-qdltx_f635920e-b830-40f5-afbb-d3a21ac15900/kube-rbac-proxy/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.367721 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5bfbbb859d-bcnp2_2fa5a388-dd09-43cb-90d4-01bc536e1e82/manager/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.474626 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748967c98-qdltx_f635920e-b830-40f5-afbb-d3a21ac15900/manager/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.518793 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6788cc6d75-szknd_5006fa30-e563-468f-87c1-d062ca2aacc9/kube-rbac-proxy/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.604850 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6788cc6d75-szknd_5006fa30-e563-468f-87c1-d062ca2aacc9/manager/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.735078 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-85fbd69fcd-pkdwc_2f1667cb-e22c-443f-ab25-594205cd0f52/kube-rbac-proxy/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.796932 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-85fbd69fcd-pkdwc_2f1667cb-e22c-443f-ab25-594205cd0f52/manager/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.920528 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-698d6fd7d6-8ff6n_fa61aacd-6e6d-4d05-a918-d81916f3f187/kube-rbac-proxy/0.log"
Dec 04 11:47:33 crc kubenswrapper[4831]: I1204 11:47:33.968419 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-698d6fd7d6-8ff6n_fa61aacd-6e6d-4d05-a918-d81916f3f187/manager/0.log"
Dec 04 11:47:34 crc kubenswrapper[4831]: I1204 11:47:34.062913 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7d5d9fd47f-78d6x_50076597-4138-4f7c-ad31-378087dfc135/kube-rbac-proxy/0.log"
Dec 04 11:47:34 crc kubenswrapper[4831]: I1204 11:47:34.163251 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7d5d9fd47f-78d6x_50076597-4138-4f7c-ad31-378087dfc135/manager/0.log"
Dec 04 11:47:34 crc kubenswrapper[4831]: I1204 11:47:34.285935 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6c55d8d69b-5945b_16cb21e3-9b80-4fcf-9389-5dc1e2343fa4/kube-rbac-proxy/0.log"
Dec 04 11:47:34 crc kubenswrapper[4831]: I1204 11:47:34.415131 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54485f899-nj5xp_38d9a0e7-49b9-4f91-938e-c040eef2bb37/kube-rbac-proxy/0.log"
Dec 04 11:47:34 crc kubenswrapper[4831]: I1204 11:47:34.453229 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6c55d8d69b-5945b_16cb21e3-9b80-4fcf-9389-5dc1e2343fa4/manager/0.log"
Dec 04 11:47:34 crc kubenswrapper[4831]: I1204 11:47:34.519622 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54485f899-nj5xp_38d9a0e7-49b9-4f91-938e-c040eef2bb37/manager/0.log"
Dec 04 11:47:34 crc kubenswrapper[4831]: I1204 11:47:34.654116 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-79cc9d59f5-kwgb2_72db73b6-9683-45a8-8b41-4ced2e11efdd/kube-rbac-proxy/0.log"
Dec 04 11:47:34 crc kubenswrapper[4831]: I1204 11:47:34.738165 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-79cc9d59f5-kwgb2_72db73b6-9683-45a8-8b41-4ced2e11efdd/manager/0.log"
Dec 04 11:47:34 crc kubenswrapper[4831]: I1204 11:47:34.848061 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5cbc8c7f96-l5fq4_09dfcf4a-2343-414a-a731-a64a914ab3db/kube-rbac-proxy/0.log"
Dec 04 11:47:34 crc kubenswrapper[4831]: I1204 11:47:34.864504 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5cbc8c7f96-l5fq4_09dfcf4a-2343-414a-a731-a64a914ab3db/manager/0.log"
Dec 04 11:47:34 crc kubenswrapper[4831]: I1204 11:47:34.966378 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-64d7c556cd-8dcqz_10402b7d-27c5-4672-ba4e-2e67fcb5bf68/kube-rbac-proxy/0.log"
Dec 04 11:47:35 crc kubenswrapper[4831]: I1204 11:47:35.058613 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-64d7c556cd-8dcqz_10402b7d-27c5-4672-ba4e-2e67fcb5bf68/manager/0.log"
Dec 04 11:47:35 crc kubenswrapper[4831]: I1204 11:47:35.161963 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-58879495c-8ffv5_b4b2ad0a-7308-4184-9d37-0c6b60ff873c/kube-rbac-proxy/0.log"
Dec 04 11:47:35 crc kubenswrapper[4831]: I1204 11:47:35.240595 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-58879495c-8ffv5_b4b2ad0a-7308-4184-9d37-0c6b60ff873c/manager/0.log"
Dec 04 11:47:35 crc kubenswrapper[4831]: I1204 11:47:35.259082 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79d658b66d-7djfw_2381f3a2-3514-4ff4-bfd7-47cb5587265c/kube-rbac-proxy/0.log"
Dec 04 11:47:35 crc kubenswrapper[4831]: I1204 11:47:35.436754 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79d658b66d-7djfw_2381f3a2-3514-4ff4-bfd7-47cb5587265c/manager/0.log"
Dec 04 11:47:35 crc kubenswrapper[4831]: I1204 11:47:35.508550 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-d5fb87cb8-zcm4n_9935df64-3d24-41fc-bc66-c2c576211287/kube-rbac-proxy/0.log"
Dec 04 11:47:35 crc kubenswrapper[4831]: I1204 11:47:35.509169 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-d5fb87cb8-zcm4n_9935df64-3d24-41fc-bc66-c2c576211287/manager/0.log"
Dec 04 11:47:35 crc kubenswrapper[4831]: I1204 11:47:35.646762 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77868f484-2j4hr_2cce552d-d699-47f9-9405-1dd33478f23d/kube-rbac-proxy/0.log"
Dec 04 11:47:35 crc kubenswrapper[4831]: I1204 11:47:35.710687 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77868f484-2j4hr_2cce552d-d699-47f9-9405-1dd33478f23d/manager/0.log"
Dec 04 11:47:35 crc kubenswrapper[4831]: I1204 11:47:35.874644 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-76898dc959-mxx27_8a70b911-37d6-41d5-81cf-9c631572a523/kube-rbac-proxy/0.log"
Dec 04 11:47:35 crc kubenswrapper[4831]: I1204 11:47:35.971444 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-78f7b66457-gw6rr_308de017-ff36-4bae-95e7-e0b5c986e62e/kube-rbac-proxy/0.log"
Dec 04 11:47:36 crc kubenswrapper[4831]: I1204 11:47:36.250741 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-gz9v7_1611e770-dd83-4a9d-a496-1c36c7246ef0/registry-server/0.log"
Dec 04 11:47:36 crc kubenswrapper[4831]: I1204 11:47:36.358008 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-78f7b66457-gw6rr_308de017-ff36-4bae-95e7-e0b5c986e62e/operator/0.log"
Dec 04 11:47:36 crc kubenswrapper[4831]: I1204 11:47:36.505209 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5b67cfc8fb-jcvk6_0c772a6a-6023-4eea-870a-904ae2d47896/kube-rbac-proxy/0.log"
Dec 04 11:47:36 crc kubenswrapper[4831]: I1204 11:47:36.738421 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5b67cfc8fb-jcvk6_0c772a6a-6023-4eea-870a-904ae2d47896/manager/0.log"
Dec 04 11:47:36 crc kubenswrapper[4831]: I1204 11:47:36.802288 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-867d87977b-m5ds4_de9a52da-79c5-43dd-8a20-eb9917314ad5/kube-rbac-proxy/0.log"
Dec 04 11:47:37 crc kubenswrapper[4831]: I1204 11:47:37.021601 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-867d87977b-m5ds4_de9a52da-79c5-43dd-8a20-eb9917314ad5/manager/0.log"
Dec 04 11:47:37 crc kubenswrapper[4831]: I1204 11:47:37.061688 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-cvbgc_63d30685-d6c8-44cf-a586-2bb8030844da/operator/0.log"
Dec 04 11:47:37 crc kubenswrapper[4831]: I1204 11:47:37.214626 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-8f6687c44-6wj9b_f2bea76c-1144-4d84-b1f3-e00fd84aa09d/kube-rbac-proxy/0.log"
Dec 04 11:47:37 crc kubenswrapper[4831]: I1204 11:47:37.314152 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-8f6687c44-6wj9b_f2bea76c-1144-4d84-b1f3-e00fd84aa09d/manager/0.log"
Dec 04 11:47:37 crc kubenswrapper[4831]: I1204 11:47:37.361558 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-76898dc959-mxx27_8a70b911-37d6-41d5-81cf-9c631572a523/manager/0.log"
Dec 04 11:47:37 crc kubenswrapper[4831]: I1204 11:47:37.393108 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-695797c565-7tbpv_b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970/kube-rbac-proxy/0.log"
Dec 04 11:47:37 crc kubenswrapper[4831]: I1204 11:47:37.568450 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-bb86466d8-x2g4s_578ef786-b985-42c9-ae2c-5d93b7359382/kube-rbac-proxy/0.log"
Dec 04 11:47:37 crc kubenswrapper[4831]: I1204 11:47:37.614300 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-695797c565-7tbpv_b3bc64ee-f0c4-4c2a-99b7-0c7bd8a5a970/manager/0.log"
Dec 04 11:47:37 crc kubenswrapper[4831]: I1204 11:47:37.650877 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-bb86466d8-x2g4s_578ef786-b985-42c9-ae2c-5d93b7359382/manager/0.log"
Dec 04 11:47:37 crc kubenswrapper[4831]: I1204 11:47:37.740791 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f99c4b8d7-bl79v_1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6/kube-rbac-proxy/0.log"
Dec 04 11:47:37 crc kubenswrapper[4831]: I1204 11:47:37.829248 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f99c4b8d7-bl79v_1cd0c273-75dc-4c87-bbf5-0a68b76f8ca6/manager/0.log"
Dec 04 11:47:44 crc kubenswrapper[4831]: I1204 11:47:44.277323 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:47:44 crc kubenswrapper[4831]: E1204 11:47:44.278146 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:47:53 crc kubenswrapper[4831]: I1204 11:47:53.040918 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mgrdx_68cdd202-055b-41c7-ac5f-a13b918c44fc/control-plane-machine-set-operator/0.log"
Dec 04 11:47:53 crc kubenswrapper[4831]: I1204 11:47:53.221545 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4bj64_7da1b053-67d9-4a4c-842e-cafb5dce5017/kube-rbac-proxy/0.log"
Dec 04 11:47:53 crc kubenswrapper[4831]: I1204 11:47:53.246142 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4bj64_7da1b053-67d9-4a4c-842e-cafb5dce5017/machine-api-operator/0.log"
Dec 04 11:47:56 crc kubenswrapper[4831]: I1204 11:47:56.276496 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:47:56 crc kubenswrapper[4831]: E1204 11:47:56.277314 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:48:04 crc kubenswrapper[4831]: I1204 11:48:04.762425 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4t49t_b5c24c38-2e13-41db-87b8-187403880fba/cert-manager-controller/0.log"
Dec 04 11:48:04 crc kubenswrapper[4831]: I1204 11:48:04.925603 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-hw7ft_d66df267-588e-4675-8081-046e41097b63/cert-manager-cainjector/0.log"
Dec 04 11:48:05 crc kubenswrapper[4831]: I1204 11:48:05.003419 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-ktsxl_d6099ee7-1019-4da3-b3c2-6b47d3c81931/cert-manager-webhook/0.log"
Dec 04 11:48:07 crc kubenswrapper[4831]: I1204 11:48:07.277191 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:48:07 crc kubenswrapper[4831]: E1204 11:48:07.277813 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:48:16 crc kubenswrapper[4831]: I1204 11:48:16.110234 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-pqz7x_662bab54-215c-4d55-89c1-9ae40f088ae8/nmstate-console-plugin/0.log"
Dec 04 11:48:16 crc kubenswrapper[4831]: I1204 11:48:16.297203 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xxbff_db769c39-2e84-4cf2-b604-73ca1c18c017/nmstate-handler/0.log"
Dec 04 11:48:16 crc kubenswrapper[4831]: I1204 11:48:16.332707 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-5xmlv_7792787f-3a02-47e6-b818-8875b8c5b1d7/kube-rbac-proxy/0.log"
Dec 04 11:48:16 crc kubenswrapper[4831]: I1204 11:48:16.389690 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-5xmlv_7792787f-3a02-47e6-b818-8875b8c5b1d7/nmstate-metrics/0.log"
Dec 04 11:48:16 crc kubenswrapper[4831]: I1204 11:48:16.495752 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-5h7gk_28920e6c-c06d-4dd1-8cb4-5cc192c2e8a7/nmstate-operator/0.log"
Dec 04 11:48:16 crc kubenswrapper[4831]: I1204 11:48:16.586260 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-xgl8t_61015d8b-cce7-496b-abc6-3b8728072665/nmstate-webhook/0.log"
Dec 04 11:48:20 crc kubenswrapper[4831]: I1204 11:48:20.276809 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:48:20 crc kubenswrapper[4831]: E1204 11:48:20.277649 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"
Dec 04 11:48:32 crc kubenswrapper[4831]: I1204 11:48:32.144621 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-m2fz5_fc7c8ea8-2f48-4b72-8209-3763c0fe74e4/kube-rbac-proxy/0.log"
Dec 04 11:48:32 crc kubenswrapper[4831]: I1204 11:48:32.274372 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-m2fz5_fc7c8ea8-2f48-4b72-8209-3763c0fe74e4/controller/0.log"
Dec 04 11:48:32 crc kubenswrapper[4831]: I1204 11:48:32.369946 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-frr-files/0.log"
Dec 04 11:48:32 crc kubenswrapper[4831]: I1204 11:48:32.660243 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-frr-files/0.log"
Dec 04 11:48:32 crc kubenswrapper[4831]: I1204 11:48:32.681235 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-reloader/0.log"
Dec 04 11:48:32 crc kubenswrapper[4831]: I1204 11:48:32.761217 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-reloader/0.log"
Dec 04 11:48:32 crc kubenswrapper[4831]: I1204 11:48:32.761768 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-metrics/0.log"
Dec 04 11:48:32 crc kubenswrapper[4831]: I1204 11:48:32.963359 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-reloader/0.log"
Dec 04 11:48:32 crc kubenswrapper[4831]: I1204 11:48:32.984971 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-frr-files/0.log"
Dec 04 11:48:32 crc kubenswrapper[4831]: I1204 11:48:32.990743 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-metrics/0.log"
Dec 04 11:48:32 crc kubenswrapper[4831]: I1204 11:48:32.991837 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-metrics/0.log"
Dec 04 11:48:33 crc kubenswrapper[4831]: I1204 11:48:33.262018 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/controller/0.log"
Dec 04 11:48:33 crc kubenswrapper[4831]: I1204 11:48:33.272839 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-reloader/0.log"
Dec 04 11:48:33 crc kubenswrapper[4831]: I1204 11:48:33.313071 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-metrics/0.log"
Dec 04 11:48:33 crc kubenswrapper[4831]: I1204 11:48:33.316182 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/cp-frr-files/0.log"
Dec 04 11:48:33 crc kubenswrapper[4831]: I1204 11:48:33.461225 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/frr-metrics/0.log"
Dec 04 11:48:33 crc kubenswrapper[4831]: I1204 11:48:33.503133 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/kube-rbac-proxy-frr/0.log"
Dec 04 11:48:33 crc kubenswrapper[4831]: I1204 11:48:33.508901 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/kube-rbac-proxy/0.log"
Dec 04 11:48:33 crc kubenswrapper[4831]: I1204 11:48:33.711925 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/reloader/0.log"
Dec 04 11:48:33 crc kubenswrapper[4831]: I1204 11:48:33.734948 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-fw678_f49f5bce-b752-4729-aa67-28847f9f04b1/frr-k8s-webhook-server/0.log"
Dec 04 11:48:34 crc kubenswrapper[4831]: I1204 11:48:34.081201 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-797466985-hc4vf_2f3f650b-08d4-44fb-80c0-eed2144aa7fd/manager/0.log"
Dec 04 11:48:34 crc kubenswrapper[4831]: I1204 11:48:34.152121 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-fb84d4944-xtmbd_513753c7-a982-42db-b4f9-a06f04c2f806/webhook-server/0.log"
Dec 04 11:48:34 crc kubenswrapper[4831]: I1204 11:48:34.277892 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b"
Dec 04 11:48:34 crc kubenswrapper[4831]: I1204 11:48:34.455153 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rgd8b_43cdb7df-ed6c-4f7b-b460-467a22bfe06c/kube-rbac-proxy/0.log"
Dec 04 11:48:35 crc kubenswrapper[4831]: I1204 11:48:35.007063 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rgd8b_43cdb7df-ed6c-4f7b-b460-467a22bfe06c/speaker/0.log"
Dec 04 11:48:35 crc kubenswrapper[4831]: I1204 11:48:35.053513 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"ff414e055bbc22bc31dcdcf17a367f82d6b7294e5b2320dbfa07d71d0a78e2be"}
Dec 04 11:48:35 crc kubenswrapper[4831]: I1204 11:48:35.325189 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4d7f7_c082e7e3-e171-4637-84a9-84f8aa17b51e/frr/0.log"
Dec 04 11:48:48 crc kubenswrapper[4831]: I1204 11:48:48.309431 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2_72368101-d91a-4a33-b551-667f279020c6/util/0.log"
Dec 04 11:48:48 crc kubenswrapper[4831]: I1204 11:48:48.456900 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2_72368101-d91a-4a33-b551-667f279020c6/util/0.log"
Dec 04 11:48:48 crc kubenswrapper[4831]: I1204 11:48:48.509358 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2_72368101-d91a-4a33-b551-667f279020c6/pull/0.log"
Dec 04 11:48:48 crc kubenswrapper[4831]: I1204 11:48:48.547511 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2_72368101-d91a-4a33-b551-667f279020c6/pull/0.log"
Dec 04 11:48:48 crc kubenswrapper[4831]: I1204 11:48:48.733713 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2_72368101-d91a-4a33-b551-667f279020c6/util/0.log"
Dec 04 11:48:48 crc kubenswrapper[4831]: I1204 11:48:48.768477 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2_72368101-d91a-4a33-b551-667f279020c6/pull/0.log"
Dec 04 11:48:48 crc kubenswrapper[4831]: I1204 11:48:48.780301 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fbwhc2_72368101-d91a-4a33-b551-667f279020c6/extract/0.log"
Dec 04 11:48:49 crc kubenswrapper[4831]: I1204 11:48:49.109176 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4_a85d95a9-36ba-4a69-a690-6e2f3b5421c6/util/0.log"
Dec 04 11:48:49 crc kubenswrapper[4831]: I1204 11:48:49.289214 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4_a85d95a9-36ba-4a69-a690-6e2f3b5421c6/util/0.log"
Dec 04 11:48:49 crc kubenswrapper[4831]: I1204 11:48:49.334546 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4_a85d95a9-36ba-4a69-a690-6e2f3b5421c6/pull/0.log"
Dec 04 11:48:49 crc kubenswrapper[4831]: I1204 11:48:49.392057 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4_a85d95a9-36ba-4a69-a690-6e2f3b5421c6/pull/0.log"
Dec 04 11:48:49 crc kubenswrapper[4831]: I1204 11:48:49.541427 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4_a85d95a9-36ba-4a69-a690-6e2f3b5421c6/pull/0.log"
Dec 04 11:48:49 crc kubenswrapper[4831]: I1204 11:48:49.572116 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4_a85d95a9-36ba-4a69-a690-6e2f3b5421c6/extract/0.log"
Dec 04 11:48:49 crc kubenswrapper[4831]: I1204 11:48:49.573619 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2ww4_a85d95a9-36ba-4a69-a690-6e2f3b5421c6/util/0.log"
Dec 04 11:48:49 crc kubenswrapper[4831]: I1204 11:48:49.745538 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v_5359fc14-ea59-4f53-b1e4-c89f65453df0/util/0.log"
Dec 04
11:48:49 crc kubenswrapper[4831]: I1204 11:48:49.944309 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v_5359fc14-ea59-4f53-b1e4-c89f65453df0/util/0.log" Dec 04 11:48:49 crc kubenswrapper[4831]: I1204 11:48:49.960731 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v_5359fc14-ea59-4f53-b1e4-c89f65453df0/pull/0.log" Dec 04 11:48:49 crc kubenswrapper[4831]: I1204 11:48:49.976324 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v_5359fc14-ea59-4f53-b1e4-c89f65453df0/pull/0.log" Dec 04 11:48:50 crc kubenswrapper[4831]: I1204 11:48:50.108436 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v_5359fc14-ea59-4f53-b1e4-c89f65453df0/pull/0.log" Dec 04 11:48:50 crc kubenswrapper[4831]: I1204 11:48:50.218511 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v_5359fc14-ea59-4f53-b1e4-c89f65453df0/extract/0.log" Dec 04 11:48:50 crc kubenswrapper[4831]: I1204 11:48:50.224465 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ljd2v_5359fc14-ea59-4f53-b1e4-c89f65453df0/util/0.log" Dec 04 11:48:50 crc kubenswrapper[4831]: I1204 11:48:50.354377 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95vcr_910ac3eb-beda-4174-b3da-e3d708ffbcc3/extract-utilities/0.log" Dec 04 11:48:50 crc kubenswrapper[4831]: I1204 11:48:50.518848 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-95vcr_910ac3eb-beda-4174-b3da-e3d708ffbcc3/extract-content/0.log" Dec 04 11:48:50 crc kubenswrapper[4831]: I1204 11:48:50.531718 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95vcr_910ac3eb-beda-4174-b3da-e3d708ffbcc3/extract-utilities/0.log" Dec 04 11:48:50 crc kubenswrapper[4831]: I1204 11:48:50.584179 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95vcr_910ac3eb-beda-4174-b3da-e3d708ffbcc3/extract-content/0.log" Dec 04 11:48:50 crc kubenswrapper[4831]: I1204 11:48:50.762871 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95vcr_910ac3eb-beda-4174-b3da-e3d708ffbcc3/extract-utilities/0.log" Dec 04 11:48:50 crc kubenswrapper[4831]: I1204 11:48:50.810797 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95vcr_910ac3eb-beda-4174-b3da-e3d708ffbcc3/extract-content/0.log" Dec 04 11:48:51 crc kubenswrapper[4831]: I1204 11:48:51.001619 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99lqd_6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1/extract-utilities/0.log" Dec 04 11:48:51 crc kubenswrapper[4831]: I1204 11:48:51.239085 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99lqd_6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1/extract-utilities/0.log" Dec 04 11:48:51 crc kubenswrapper[4831]: I1204 11:48:51.249554 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99lqd_6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1/extract-content/0.log" Dec 04 11:48:51 crc kubenswrapper[4831]: I1204 11:48:51.346644 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-99lqd_6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1/extract-content/0.log" Dec 04 11:48:51 crc kubenswrapper[4831]: I1204 11:48:51.548447 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99lqd_6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1/extract-utilities/0.log" Dec 04 11:48:51 crc kubenswrapper[4831]: I1204 11:48:51.557110 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99lqd_6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1/extract-content/0.log" Dec 04 11:48:51 crc kubenswrapper[4831]: I1204 11:48:51.625475 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95vcr_910ac3eb-beda-4174-b3da-e3d708ffbcc3/registry-server/0.log" Dec 04 11:48:51 crc kubenswrapper[4831]: I1204 11:48:51.833949 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jpjfl_398ad5ca-6c9b-4503-a64e-c31a5e34205a/marketplace-operator/0.log" Dec 04 11:48:52 crc kubenswrapper[4831]: I1204 11:48:52.061357 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j9qm7_a598e150-ad70-412c-bf06-9e7bd26a8422/extract-utilities/0.log" Dec 04 11:48:52 crc kubenswrapper[4831]: I1204 11:48:52.335918 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j9qm7_a598e150-ad70-412c-bf06-9e7bd26a8422/extract-content/0.log" Dec 04 11:48:52 crc kubenswrapper[4831]: I1204 11:48:52.338936 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j9qm7_a598e150-ad70-412c-bf06-9e7bd26a8422/extract-utilities/0.log" Dec 04 11:48:52 crc kubenswrapper[4831]: I1204 11:48:52.398695 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-j9qm7_a598e150-ad70-412c-bf06-9e7bd26a8422/extract-content/0.log" Dec 04 11:48:52 crc kubenswrapper[4831]: I1204 11:48:52.602407 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j9qm7_a598e150-ad70-412c-bf06-9e7bd26a8422/extract-utilities/0.log" Dec 04 11:48:52 crc kubenswrapper[4831]: I1204 11:48:52.612203 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j9qm7_a598e150-ad70-412c-bf06-9e7bd26a8422/extract-content/0.log" Dec 04 11:48:52 crc kubenswrapper[4831]: I1204 11:48:52.799497 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khfsg_1d852025-4ea6-4343-813c-2411dec5469f/extract-utilities/0.log" Dec 04 11:48:52 crc kubenswrapper[4831]: I1204 11:48:52.944879 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khfsg_1d852025-4ea6-4343-813c-2411dec5469f/extract-utilities/0.log" Dec 04 11:48:52 crc kubenswrapper[4831]: I1204 11:48:52.966200 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khfsg_1d852025-4ea6-4343-813c-2411dec5469f/extract-content/0.log" Dec 04 11:48:53 crc kubenswrapper[4831]: I1204 11:48:53.000719 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khfsg_1d852025-4ea6-4343-813c-2411dec5469f/extract-content/0.log" Dec 04 11:48:53 crc kubenswrapper[4831]: I1204 11:48:53.169462 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khfsg_1d852025-4ea6-4343-813c-2411dec5469f/extract-utilities/0.log" Dec 04 11:48:53 crc kubenswrapper[4831]: I1204 11:48:53.170777 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khfsg_1d852025-4ea6-4343-813c-2411dec5469f/extract-content/0.log" Dec 
04 11:48:53 crc kubenswrapper[4831]: I1204 11:48:53.826543 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j9qm7_a598e150-ad70-412c-bf06-9e7bd26a8422/registry-server/0.log" Dec 04 11:48:53 crc kubenswrapper[4831]: I1204 11:48:53.905714 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99lqd_6a0d708c-5b81-4c5d-9fd7-98e2aed6e7f1/registry-server/0.log" Dec 04 11:48:54 crc kubenswrapper[4831]: I1204 11:48:54.516205 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-khfsg_1d852025-4ea6-4343-813c-2411dec5469f/registry-server/0.log" Dec 04 11:49:06 crc kubenswrapper[4831]: I1204 11:49:06.313278 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-tmjrz_6aea0d1c-6c0f-4d5a-a071-c1c597eea91c/prometheus-operator/0.log" Dec 04 11:49:06 crc kubenswrapper[4831]: I1204 11:49:06.486463 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-74cccbd8d-6mtfn_392c5061-c622-463b-b71c-961e2495e965/prometheus-operator-admission-webhook/0.log" Dec 04 11:49:06 crc kubenswrapper[4831]: I1204 11:49:06.491124 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-74cccbd8d-4qkhw_cd429fe0-b539-4209-a587-a9534c8fcc74/prometheus-operator-admission-webhook/0.log" Dec 04 11:49:06 crc kubenswrapper[4831]: I1204 11:49:06.665510 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-j8hhw_a0aeeeda-2835-40be-bc77-78056739952f/operator/0.log" Dec 04 11:49:06 crc kubenswrapper[4831]: I1204 11:49:06.719557 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-b4ljp_515a4768-685f-45a1-b7e7-0b0087ba126e/perses-operator/0.log" Dec 04 11:50:51 crc kubenswrapper[4831]: I1204 11:50:51.972067 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:50:51 crc kubenswrapper[4831]: I1204 11:50:51.972655 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:51:07 crc kubenswrapper[4831]: I1204 11:51:07.836682 4831 generic.go:334] "Generic (PLEG): container finished" podID="4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" containerID="9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964" exitCode=0 Dec 04 11:51:07 crc kubenswrapper[4831]: I1204 11:51:07.836772 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fn8lm/must-gather-fvqbw" event={"ID":"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222","Type":"ContainerDied","Data":"9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964"} Dec 04 11:51:07 crc kubenswrapper[4831]: I1204 11:51:07.837823 4831 scope.go:117] "RemoveContainer" containerID="9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964" Dec 04 11:51:08 crc kubenswrapper[4831]: I1204 11:51:08.380345 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fn8lm_must-gather-fvqbw_4eb9715b-7d23-4281-8ee2-c4f5bbbbb222/gather/0.log" Dec 04 11:51:16 crc kubenswrapper[4831]: I1204 11:51:16.836056 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-fn8lm/must-gather-fvqbw"] Dec 04 11:51:16 crc kubenswrapper[4831]: I1204 11:51:16.836924 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fn8lm/must-gather-fvqbw" podUID="4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" containerName="copy" containerID="cri-o://e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d" gracePeriod=2 Dec 04 11:51:16 crc kubenswrapper[4831]: I1204 11:51:16.846835 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fn8lm/must-gather-fvqbw"] Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.372842 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fn8lm_must-gather-fvqbw_4eb9715b-7d23-4281-8ee2-c4f5bbbbb222/copy/0.log" Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.373672 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fn8lm/must-gather-fvqbw" Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.449256 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-must-gather-output\") pod \"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222\" (UID: \"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222\") " Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.449343 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtv8g\" (UniqueName: \"kubernetes.io/projected/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-kube-api-access-rtv8g\") pod \"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222\" (UID: \"4eb9715b-7d23-4281-8ee2-c4f5bbbbb222\") " Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.456442 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-kube-api-access-rtv8g" (OuterVolumeSpecName: 
"kube-api-access-rtv8g") pod "4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" (UID: "4eb9715b-7d23-4281-8ee2-c4f5bbbbb222"). InnerVolumeSpecName "kube-api-access-rtv8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.551407 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtv8g\" (UniqueName: \"kubernetes.io/projected/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-kube-api-access-rtv8g\") on node \"crc\" DevicePath \"\"" Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.658283 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" (UID: "4eb9715b-7d23-4281-8ee2-c4f5bbbbb222"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.755957 4831 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.931792 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fn8lm_must-gather-fvqbw_4eb9715b-7d23-4281-8ee2-c4f5bbbbb222/copy/0.log" Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.932320 4831 generic.go:334] "Generic (PLEG): container finished" podID="4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" containerID="e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d" exitCode=143 Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.932387 4831 scope.go:117] "RemoveContainer" containerID="e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d" Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.932396 4831 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-fn8lm/must-gather-fvqbw" Dec 04 11:51:17 crc kubenswrapper[4831]: I1204 11:51:17.956020 4831 scope.go:117] "RemoveContainer" containerID="9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964" Dec 04 11:51:18 crc kubenswrapper[4831]: I1204 11:51:18.036034 4831 scope.go:117] "RemoveContainer" containerID="e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d" Dec 04 11:51:18 crc kubenswrapper[4831]: E1204 11:51:18.036500 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d\": container with ID starting with e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d not found: ID does not exist" containerID="e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d" Dec 04 11:51:18 crc kubenswrapper[4831]: I1204 11:51:18.036545 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d"} err="failed to get container status \"e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d\": rpc error: code = NotFound desc = could not find container \"e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d\": container with ID starting with e151c4513af4e786fb38ea4e89f740a7509cb23d56977680f6f100ea4008b39d not found: ID does not exist" Dec 04 11:51:18 crc kubenswrapper[4831]: I1204 11:51:18.036571 4831 scope.go:117] "RemoveContainer" containerID="9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964" Dec 04 11:51:18 crc kubenswrapper[4831]: E1204 11:51:18.036900 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964\": container with ID starting with 
9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964 not found: ID does not exist" containerID="9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964" Dec 04 11:51:18 crc kubenswrapper[4831]: I1204 11:51:18.036950 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964"} err="failed to get container status \"9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964\": rpc error: code = NotFound desc = could not find container \"9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964\": container with ID starting with 9d7a244aa349ddccf8e79186f17317e7ec24ed12092ab571cf5b107c798cf964 not found: ID does not exist" Dec 04 11:51:19 crc kubenswrapper[4831]: I1204 11:51:19.291068 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" path="/var/lib/kubelet/pods/4eb9715b-7d23-4281-8ee2-c4f5bbbbb222/volumes" Dec 04 11:51:21 crc kubenswrapper[4831]: I1204 11:51:21.970995 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:51:21 crc kubenswrapper[4831]: I1204 11:51:21.971461 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:51:28 crc kubenswrapper[4831]: I1204 11:51:28.932884 4831 scope.go:117] "RemoveContainer" containerID="98f57286c02e70d87767adce8d08273ddfdf820caf2e1faf4b606a87898337d5" Dec 04 11:51:29 crc kubenswrapper[4831]: I1204 
11:51:29.290107 4831 scope.go:117] "RemoveContainer" containerID="49a6dd85fdf42dbed91d8463e9bbf18898a8cbd728320201f2cc4f9f759c1ff0" Dec 04 11:51:29 crc kubenswrapper[4831]: I1204 11:51:29.315332 4831 scope.go:117] "RemoveContainer" containerID="e0d23523e51312b7828baff95af25d0c41aafd10dc0b79508ce9bcafc712ce94" Dec 04 11:51:51 crc kubenswrapper[4831]: I1204 11:51:51.972096 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:51:51 crc kubenswrapper[4831]: I1204 11:51:51.972798 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:51:51 crc kubenswrapper[4831]: I1204 11:51:51.972865 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 11:51:51 crc kubenswrapper[4831]: I1204 11:51:51.973949 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff414e055bbc22bc31dcdcf17a367f82d6b7294e5b2320dbfa07d71d0a78e2be"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:51:51 crc kubenswrapper[4831]: I1204 11:51:51.974043 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" 
containerID="cri-o://ff414e055bbc22bc31dcdcf17a367f82d6b7294e5b2320dbfa07d71d0a78e2be" gracePeriod=600 Dec 04 11:51:52 crc kubenswrapper[4831]: I1204 11:51:52.379946 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="ff414e055bbc22bc31dcdcf17a367f82d6b7294e5b2320dbfa07d71d0a78e2be" exitCode=0 Dec 04 11:51:52 crc kubenswrapper[4831]: I1204 11:51:52.380021 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"ff414e055bbc22bc31dcdcf17a367f82d6b7294e5b2320dbfa07d71d0a78e2be"} Dec 04 11:51:52 crc kubenswrapper[4831]: I1204 11:51:52.380649 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerStarted","Data":"4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1"} Dec 04 11:51:52 crc kubenswrapper[4831]: I1204 11:51:52.380695 4831 scope.go:117] "RemoveContainer" containerID="c10daa181ba567efa481df940488e4d09754266337731799be708a612b58e69b" Dec 04 11:52:29 crc kubenswrapper[4831]: I1204 11:52:29.445312 4831 scope.go:117] "RemoveContainer" containerID="dc64faab6b989aba4421710c40a3ef9182621d2e52b39051186118746761395d" Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.827110 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vgc6m"] Dec 04 11:53:42 crc kubenswrapper[4831]: E1204 11:53:42.829022 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" containerName="copy" Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.829048 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" containerName="copy" Dec 04 11:53:42 crc kubenswrapper[4831]: E1204 11:53:42.829111 4831 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b9f49f-b55a-4e75-961f-249e555d3d07" containerName="container-00" Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.829124 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b9f49f-b55a-4e75-961f-249e555d3d07" containerName="container-00" Dec 04 11:53:42 crc kubenswrapper[4831]: E1204 11:53:42.829154 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" containerName="gather" Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.829164 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" containerName="gather" Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.829485 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" containerName="gather" Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.829513 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b9f49f-b55a-4e75-961f-249e555d3d07" containerName="container-00" Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.829540 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb9715b-7d23-4281-8ee2-c4f5bbbbb222" containerName="copy" Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.832010 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.848195 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgc6m"] Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.988280 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-utilities\") pod \"redhat-operators-vgc6m\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.988439 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-catalog-content\") pod \"redhat-operators-vgc6m\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:42 crc kubenswrapper[4831]: I1204 11:53:42.988474 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2w6m\" (UniqueName: \"kubernetes.io/projected/069e13f7-90a0-47f9-9925-90dea640741e-kube-api-access-g2w6m\") pod \"redhat-operators-vgc6m\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:43 crc kubenswrapper[4831]: I1204 11:53:43.089951 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-catalog-content\") pod \"redhat-operators-vgc6m\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:43 crc kubenswrapper[4831]: I1204 11:53:43.090274 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-g2w6m\" (UniqueName: \"kubernetes.io/projected/069e13f7-90a0-47f9-9925-90dea640741e-kube-api-access-g2w6m\") pod \"redhat-operators-vgc6m\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:43 crc kubenswrapper[4831]: I1204 11:53:43.090887 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-utilities\") pod \"redhat-operators-vgc6m\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:43 crc kubenswrapper[4831]: I1204 11:53:43.090606 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-catalog-content\") pod \"redhat-operators-vgc6m\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:43 crc kubenswrapper[4831]: I1204 11:53:43.091288 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-utilities\") pod \"redhat-operators-vgc6m\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:43 crc kubenswrapper[4831]: I1204 11:53:43.114317 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2w6m\" (UniqueName: \"kubernetes.io/projected/069e13f7-90a0-47f9-9925-90dea640741e-kube-api-access-g2w6m\") pod \"redhat-operators-vgc6m\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:43 crc kubenswrapper[4831]: I1204 11:53:43.172297 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:43 crc kubenswrapper[4831]: I1204 11:53:43.742994 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgc6m"] Dec 04 11:53:44 crc kubenswrapper[4831]: I1204 11:53:44.488803 4831 generic.go:334] "Generic (PLEG): container finished" podID="069e13f7-90a0-47f9-9925-90dea640741e" containerID="ab93a9de62cf6dde9e5d3e5dcaae994f7ea18dde0c7e1e8d00adac8b5d9f61a5" exitCode=0 Dec 04 11:53:44 crc kubenswrapper[4831]: I1204 11:53:44.488889 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgc6m" event={"ID":"069e13f7-90a0-47f9-9925-90dea640741e","Type":"ContainerDied","Data":"ab93a9de62cf6dde9e5d3e5dcaae994f7ea18dde0c7e1e8d00adac8b5d9f61a5"} Dec 04 11:53:44 crc kubenswrapper[4831]: I1204 11:53:44.489342 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgc6m" event={"ID":"069e13f7-90a0-47f9-9925-90dea640741e","Type":"ContainerStarted","Data":"01eb463d8ef5cd997f23bd27fa5d18faa92e566d0665216c882b51841b1a3c8a"} Dec 04 11:53:44 crc kubenswrapper[4831]: I1204 11:53:44.491061 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 11:53:46 crc kubenswrapper[4831]: I1204 11:53:46.509279 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgc6m" event={"ID":"069e13f7-90a0-47f9-9925-90dea640741e","Type":"ContainerStarted","Data":"e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829"} Dec 04 11:53:48 crc kubenswrapper[4831]: I1204 11:53:48.539083 4831 generic.go:334] "Generic (PLEG): container finished" podID="069e13f7-90a0-47f9-9925-90dea640741e" containerID="e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829" exitCode=0 Dec 04 11:53:48 crc kubenswrapper[4831]: I1204 11:53:48.539209 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-vgc6m" event={"ID":"069e13f7-90a0-47f9-9925-90dea640741e","Type":"ContainerDied","Data":"e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829"} Dec 04 11:53:50 crc kubenswrapper[4831]: I1204 11:53:50.558104 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgc6m" event={"ID":"069e13f7-90a0-47f9-9925-90dea640741e","Type":"ContainerStarted","Data":"f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823"} Dec 04 11:53:50 crc kubenswrapper[4831]: I1204 11:53:50.586097 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vgc6m" podStartSLOduration=3.649968104 podStartE2EDuration="8.586067351s" podCreationTimestamp="2025-12-04 11:53:42 +0000 UTC" firstStartedPulling="2025-12-04 11:53:44.490839527 +0000 UTC m=+5921.440014841" lastFinishedPulling="2025-12-04 11:53:49.426938774 +0000 UTC m=+5926.376114088" observedRunningTime="2025-12-04 11:53:50.575715493 +0000 UTC m=+5927.524890817" watchObservedRunningTime="2025-12-04 11:53:50.586067351 +0000 UTC m=+5927.535242685" Dec 04 11:53:53 crc kubenswrapper[4831]: I1204 11:53:53.173231 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:53 crc kubenswrapper[4831]: I1204 11:53:53.173633 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:53:54 crc kubenswrapper[4831]: I1204 11:53:54.250424 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vgc6m" podUID="069e13f7-90a0-47f9-9925-90dea640741e" containerName="registry-server" probeResult="failure" output=< Dec 04 11:53:54 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 04 11:53:54 crc kubenswrapper[4831]: > Dec 04 11:54:03 crc kubenswrapper[4831]: I1204 
11:54:03.224469 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:54:03 crc kubenswrapper[4831]: I1204 11:54:03.289067 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:54:03 crc kubenswrapper[4831]: I1204 11:54:03.571212 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vgc6m"] Dec 04 11:54:04 crc kubenswrapper[4831]: I1204 11:54:04.695306 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vgc6m" podUID="069e13f7-90a0-47f9-9925-90dea640741e" containerName="registry-server" containerID="cri-o://f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823" gracePeriod=2 Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.159777 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.272923 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-catalog-content\") pod \"069e13f7-90a0-47f9-9925-90dea640741e\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.273111 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-utilities\") pod \"069e13f7-90a0-47f9-9925-90dea640741e\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.273254 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2w6m\" (UniqueName: 
\"kubernetes.io/projected/069e13f7-90a0-47f9-9925-90dea640741e-kube-api-access-g2w6m\") pod \"069e13f7-90a0-47f9-9925-90dea640741e\" (UID: \"069e13f7-90a0-47f9-9925-90dea640741e\") " Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.274497 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-utilities" (OuterVolumeSpecName: "utilities") pod "069e13f7-90a0-47f9-9925-90dea640741e" (UID: "069e13f7-90a0-47f9-9925-90dea640741e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.278557 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.292762 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/069e13f7-90a0-47f9-9925-90dea640741e-kube-api-access-g2w6m" (OuterVolumeSpecName: "kube-api-access-g2w6m") pod "069e13f7-90a0-47f9-9925-90dea640741e" (UID: "069e13f7-90a0-47f9-9925-90dea640741e"). InnerVolumeSpecName "kube-api-access-g2w6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.380482 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2w6m\" (UniqueName: \"kubernetes.io/projected/069e13f7-90a0-47f9-9925-90dea640741e-kube-api-access-g2w6m\") on node \"crc\" DevicePath \"\"" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.385880 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "069e13f7-90a0-47f9-9925-90dea640741e" (UID: "069e13f7-90a0-47f9-9925-90dea640741e"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.482492 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069e13f7-90a0-47f9-9925-90dea640741e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.706985 4831 generic.go:334] "Generic (PLEG): container finished" podID="069e13f7-90a0-47f9-9925-90dea640741e" containerID="f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823" exitCode=0 Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.707031 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgc6m" event={"ID":"069e13f7-90a0-47f9-9925-90dea640741e","Type":"ContainerDied","Data":"f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823"} Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.707058 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgc6m" event={"ID":"069e13f7-90a0-47f9-9925-90dea640741e","Type":"ContainerDied","Data":"01eb463d8ef5cd997f23bd27fa5d18faa92e566d0665216c882b51841b1a3c8a"} Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.707076 4831 scope.go:117] "RemoveContainer" containerID="f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.707215 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vgc6m" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.730171 4831 scope.go:117] "RemoveContainer" containerID="e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.746920 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vgc6m"] Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.756859 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vgc6m"] Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.769816 4831 scope.go:117] "RemoveContainer" containerID="ab93a9de62cf6dde9e5d3e5dcaae994f7ea18dde0c7e1e8d00adac8b5d9f61a5" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.802624 4831 scope.go:117] "RemoveContainer" containerID="f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823" Dec 04 11:54:05 crc kubenswrapper[4831]: E1204 11:54:05.803177 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823\": container with ID starting with f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823 not found: ID does not exist" containerID="f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.803226 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823"} err="failed to get container status \"f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823\": rpc error: code = NotFound desc = could not find container \"f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823\": container with ID starting with f281e423175978d9ed892506c21ad26d3624a10f2c32eedcc367e3924607d823 not found: ID does 
not exist" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.803254 4831 scope.go:117] "RemoveContainer" containerID="e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829" Dec 04 11:54:05 crc kubenswrapper[4831]: E1204 11:54:05.803475 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829\": container with ID starting with e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829 not found: ID does not exist" containerID="e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.803504 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829"} err="failed to get container status \"e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829\": rpc error: code = NotFound desc = could not find container \"e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829\": container with ID starting with e01af1fc303712c9d7b5ff9db61b8e006cdf22e4d09c47f54debf8c288f21829 not found: ID does not exist" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.803522 4831 scope.go:117] "RemoveContainer" containerID="ab93a9de62cf6dde9e5d3e5dcaae994f7ea18dde0c7e1e8d00adac8b5d9f61a5" Dec 04 11:54:05 crc kubenswrapper[4831]: E1204 11:54:05.803811 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab93a9de62cf6dde9e5d3e5dcaae994f7ea18dde0c7e1e8d00adac8b5d9f61a5\": container with ID starting with ab93a9de62cf6dde9e5d3e5dcaae994f7ea18dde0c7e1e8d00adac8b5d9f61a5 not found: ID does not exist" containerID="ab93a9de62cf6dde9e5d3e5dcaae994f7ea18dde0c7e1e8d00adac8b5d9f61a5" Dec 04 11:54:05 crc kubenswrapper[4831]: I1204 11:54:05.803848 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab93a9de62cf6dde9e5d3e5dcaae994f7ea18dde0c7e1e8d00adac8b5d9f61a5"} err="failed to get container status \"ab93a9de62cf6dde9e5d3e5dcaae994f7ea18dde0c7e1e8d00adac8b5d9f61a5\": rpc error: code = NotFound desc = could not find container \"ab93a9de62cf6dde9e5d3e5dcaae994f7ea18dde0c7e1e8d00adac8b5d9f61a5\": container with ID starting with ab93a9de62cf6dde9e5d3e5dcaae994f7ea18dde0c7e1e8d00adac8b5d9f61a5 not found: ID does not exist" Dec 04 11:54:07 crc kubenswrapper[4831]: I1204 11:54:07.288853 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="069e13f7-90a0-47f9-9925-90dea640741e" path="/var/lib/kubelet/pods/069e13f7-90a0-47f9-9925-90dea640741e/volumes" Dec 04 11:54:21 crc kubenswrapper[4831]: I1204 11:54:21.971202 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:54:21 crc kubenswrapper[4831]: I1204 11:54:21.971815 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.303278 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ktrb6"] Dec 04 11:54:41 crc kubenswrapper[4831]: E1204 11:54:41.304341 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069e13f7-90a0-47f9-9925-90dea640741e" containerName="extract-content" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.304357 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="069e13f7-90a0-47f9-9925-90dea640741e" containerName="extract-content" Dec 04 11:54:41 crc kubenswrapper[4831]: E1204 11:54:41.304383 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069e13f7-90a0-47f9-9925-90dea640741e" containerName="extract-utilities" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.304390 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="069e13f7-90a0-47f9-9925-90dea640741e" containerName="extract-utilities" Dec 04 11:54:41 crc kubenswrapper[4831]: E1204 11:54:41.304415 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069e13f7-90a0-47f9-9925-90dea640741e" containerName="registry-server" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.304422 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="069e13f7-90a0-47f9-9925-90dea640741e" containerName="registry-server" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.304703 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="069e13f7-90a0-47f9-9925-90dea640741e" containerName="registry-server" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.306504 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ktrb6"] Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.306611 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.427902 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqv8c\" (UniqueName: \"kubernetes.io/projected/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-kube-api-access-fqv8c\") pod \"certified-operators-ktrb6\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.427964 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-catalog-content\") pod \"certified-operators-ktrb6\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.428036 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-utilities\") pod \"certified-operators-ktrb6\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.529888 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqv8c\" (UniqueName: \"kubernetes.io/projected/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-kube-api-access-fqv8c\") pod \"certified-operators-ktrb6\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.529941 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-catalog-content\") pod 
\"certified-operators-ktrb6\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.529988 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-utilities\") pod \"certified-operators-ktrb6\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.530493 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-catalog-content\") pod \"certified-operators-ktrb6\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.530606 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-utilities\") pod \"certified-operators-ktrb6\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.553360 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqv8c\" (UniqueName: \"kubernetes.io/projected/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-kube-api-access-fqv8c\") pod \"certified-operators-ktrb6\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:41 crc kubenswrapper[4831]: I1204 11:54:41.634331 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:42 crc kubenswrapper[4831]: I1204 11:54:42.198300 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ktrb6"] Dec 04 11:54:43 crc kubenswrapper[4831]: I1204 11:54:43.083190 4831 generic.go:334] "Generic (PLEG): container finished" podID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" containerID="02c09ee52e5442f4446d5fe80a5ed78a1138cab828a21e6f0bbb6d574d9f996f" exitCode=0 Dec 04 11:54:43 crc kubenswrapper[4831]: I1204 11:54:43.083302 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktrb6" event={"ID":"d9fd7222-df0f-4511-8a50-6f8dbfc9c697","Type":"ContainerDied","Data":"02c09ee52e5442f4446d5fe80a5ed78a1138cab828a21e6f0bbb6d574d9f996f"} Dec 04 11:54:43 crc kubenswrapper[4831]: I1204 11:54:43.083541 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktrb6" event={"ID":"d9fd7222-df0f-4511-8a50-6f8dbfc9c697","Type":"ContainerStarted","Data":"b2b9330715367a83a9b6d9c096642028b53510919c045ce9969a4b94ce44822d"} Dec 04 11:54:44 crc kubenswrapper[4831]: I1204 11:54:44.097323 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktrb6" event={"ID":"d9fd7222-df0f-4511-8a50-6f8dbfc9c697","Type":"ContainerStarted","Data":"0805d8170103b2cb80fe53436f4c977e7d46a67ff718aba7a4ec583a924cf918"} Dec 04 11:54:45 crc kubenswrapper[4831]: I1204 11:54:45.109756 4831 generic.go:334] "Generic (PLEG): container finished" podID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" containerID="0805d8170103b2cb80fe53436f4c977e7d46a67ff718aba7a4ec583a924cf918" exitCode=0 Dec 04 11:54:45 crc kubenswrapper[4831]: I1204 11:54:45.109862 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktrb6" 
event={"ID":"d9fd7222-df0f-4511-8a50-6f8dbfc9c697","Type":"ContainerDied","Data":"0805d8170103b2cb80fe53436f4c977e7d46a67ff718aba7a4ec583a924cf918"} Dec 04 11:54:46 crc kubenswrapper[4831]: I1204 11:54:46.121307 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktrb6" event={"ID":"d9fd7222-df0f-4511-8a50-6f8dbfc9c697","Type":"ContainerStarted","Data":"2e496a792a3dc736a1ca7e42174db0e037eac86695a0b379701db7423ffda096"} Dec 04 11:54:46 crc kubenswrapper[4831]: I1204 11:54:46.140523 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ktrb6" podStartSLOduration=2.647053777 podStartE2EDuration="5.140501704s" podCreationTimestamp="2025-12-04 11:54:41 +0000 UTC" firstStartedPulling="2025-12-04 11:54:43.08504139 +0000 UTC m=+5980.034216704" lastFinishedPulling="2025-12-04 11:54:45.578489317 +0000 UTC m=+5982.527664631" observedRunningTime="2025-12-04 11:54:46.136785614 +0000 UTC m=+5983.085960948" watchObservedRunningTime="2025-12-04 11:54:46.140501704 +0000 UTC m=+5983.089677018" Dec 04 11:54:51 crc kubenswrapper[4831]: I1204 11:54:51.634612 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:51 crc kubenswrapper[4831]: I1204 11:54:51.635776 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:51 crc kubenswrapper[4831]: I1204 11:54:51.686539 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:51 crc kubenswrapper[4831]: I1204 11:54:51.972034 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 04 11:54:51 crc kubenswrapper[4831]: I1204 11:54:51.972103 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:54:52 crc kubenswrapper[4831]: I1204 11:54:52.243130 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:52 crc kubenswrapper[4831]: I1204 11:54:52.302711 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ktrb6"] Dec 04 11:54:54 crc kubenswrapper[4831]: I1204 11:54:54.212746 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ktrb6" podUID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" containerName="registry-server" containerID="cri-o://2e496a792a3dc736a1ca7e42174db0e037eac86695a0b379701db7423ffda096" gracePeriod=2 Dec 04 11:54:55 crc kubenswrapper[4831]: I1204 11:54:55.230134 4831 generic.go:334] "Generic (PLEG): container finished" podID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" containerID="2e496a792a3dc736a1ca7e42174db0e037eac86695a0b379701db7423ffda096" exitCode=0 Dec 04 11:54:55 crc kubenswrapper[4831]: I1204 11:54:55.230424 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktrb6" event={"ID":"d9fd7222-df0f-4511-8a50-6f8dbfc9c697","Type":"ContainerDied","Data":"2e496a792a3dc736a1ca7e42174db0e037eac86695a0b379701db7423ffda096"} Dec 04 11:54:55 crc kubenswrapper[4831]: I1204 11:54:55.857727 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.055070 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-catalog-content\") pod \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.055127 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-utilities\") pod \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.055216 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqv8c\" (UniqueName: \"kubernetes.io/projected/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-kube-api-access-fqv8c\") pod \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\" (UID: \"d9fd7222-df0f-4511-8a50-6f8dbfc9c697\") " Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.056990 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-utilities" (OuterVolumeSpecName: "utilities") pod "d9fd7222-df0f-4511-8a50-6f8dbfc9c697" (UID: "d9fd7222-df0f-4511-8a50-6f8dbfc9c697"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.062221 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-kube-api-access-fqv8c" (OuterVolumeSpecName: "kube-api-access-fqv8c") pod "d9fd7222-df0f-4511-8a50-6f8dbfc9c697" (UID: "d9fd7222-df0f-4511-8a50-6f8dbfc9c697"). InnerVolumeSpecName "kube-api-access-fqv8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.109640 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9fd7222-df0f-4511-8a50-6f8dbfc9c697" (UID: "d9fd7222-df0f-4511-8a50-6f8dbfc9c697"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.157088 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.157125 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.157138 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqv8c\" (UniqueName: \"kubernetes.io/projected/d9fd7222-df0f-4511-8a50-6f8dbfc9c697-kube-api-access-fqv8c\") on node \"crc\" DevicePath \"\"" Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.242957 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ktrb6" event={"ID":"d9fd7222-df0f-4511-8a50-6f8dbfc9c697","Type":"ContainerDied","Data":"b2b9330715367a83a9b6d9c096642028b53510919c045ce9969a4b94ce44822d"} Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.243026 4831 scope.go:117] "RemoveContainer" containerID="2e496a792a3dc736a1ca7e42174db0e037eac86695a0b379701db7423ffda096" Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.243198 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ktrb6" Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.305302 4831 scope.go:117] "RemoveContainer" containerID="0805d8170103b2cb80fe53436f4c977e7d46a67ff718aba7a4ec583a924cf918" Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.323233 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ktrb6"] Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.330972 4831 scope.go:117] "RemoveContainer" containerID="02c09ee52e5442f4446d5fe80a5ed78a1138cab828a21e6f0bbb6d574d9f996f" Dec 04 11:54:56 crc kubenswrapper[4831]: I1204 11:54:56.333642 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ktrb6"] Dec 04 11:54:57 crc kubenswrapper[4831]: I1204 11:54:57.288145 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" path="/var/lib/kubelet/pods/d9fd7222-df0f-4511-8a50-6f8dbfc9c697/volumes" Dec 04 11:55:21 crc kubenswrapper[4831]: I1204 11:55:21.971437 4831 patch_prober.go:28] interesting pod/machine-config-daemon-g76nn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:55:21 crc kubenswrapper[4831]: I1204 11:55:21.972043 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:55:21 crc kubenswrapper[4831]: I1204 11:55:21.972095 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" Dec 04 
11:55:21 crc kubenswrapper[4831]: I1204 11:55:21.973078 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1"} pod="openshift-machine-config-operator/machine-config-daemon-g76nn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:55:21 crc kubenswrapper[4831]: I1204 11:55:21.973176 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" containerName="machine-config-daemon" containerID="cri-o://4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" gracePeriod=600 Dec 04 11:55:22 crc kubenswrapper[4831]: E1204 11:55:22.101580 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:55:22 crc kubenswrapper[4831]: I1204 11:55:22.489207 4831 generic.go:334] "Generic (PLEG): container finished" podID="8475bb26-8864-4d49-935b-db7d4cb73387" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" exitCode=0 Dec 04 11:55:22 crc kubenswrapper[4831]: I1204 11:55:22.489274 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" event={"ID":"8475bb26-8864-4d49-935b-db7d4cb73387","Type":"ContainerDied","Data":"4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1"} Dec 04 11:55:22 crc kubenswrapper[4831]: I1204 11:55:22.489566 4831 scope.go:117] 
"RemoveContainer" containerID="ff414e055bbc22bc31dcdcf17a367f82d6b7294e5b2320dbfa07d71d0a78e2be" Dec 04 11:55:22 crc kubenswrapper[4831]: I1204 11:55:22.490242 4831 scope.go:117] "RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:55:22 crc kubenswrapper[4831]: E1204 11:55:22.490553 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:55:33 crc kubenswrapper[4831]: I1204 11:55:33.284460 4831 scope.go:117] "RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:55:33 crc kubenswrapper[4831]: E1204 11:55:33.285508 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:55:48 crc kubenswrapper[4831]: I1204 11:55:48.276816 4831 scope.go:117] "RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:55:48 crc kubenswrapper[4831]: E1204 11:55:48.278524 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:55:59 crc kubenswrapper[4831]: I1204 11:55:59.276531 4831 scope.go:117] "RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:55:59 crc kubenswrapper[4831]: E1204 11:55:59.277488 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.469304 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8kxr5"] Dec 04 11:56:11 crc kubenswrapper[4831]: E1204 11:56:11.470529 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" containerName="registry-server" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.470547 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" containerName="registry-server" Dec 04 11:56:11 crc kubenswrapper[4831]: E1204 11:56:11.470590 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" containerName="extract-utilities" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.470599 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" containerName="extract-utilities" Dec 04 11:56:11 crc kubenswrapper[4831]: E1204 11:56:11.470628 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" containerName="extract-content" Dec 04 11:56:11 crc kubenswrapper[4831]: 
I1204 11:56:11.470637 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" containerName="extract-content" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.470964 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9fd7222-df0f-4511-8a50-6f8dbfc9c697" containerName="registry-server" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.473675 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.479633 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kxr5"] Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.547561 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6xqh\" (UniqueName: \"kubernetes.io/projected/8135008d-f125-4852-ba58-84b9fb4c395a-kube-api-access-k6xqh\") pod \"redhat-marketplace-8kxr5\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.547822 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-catalog-content\") pod \"redhat-marketplace-8kxr5\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.548056 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-utilities\") pod \"redhat-marketplace-8kxr5\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:11 crc 
kubenswrapper[4831]: I1204 11:56:11.652019 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-catalog-content\") pod \"redhat-marketplace-8kxr5\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.652127 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-utilities\") pod \"redhat-marketplace-8kxr5\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.652182 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6xqh\" (UniqueName: \"kubernetes.io/projected/8135008d-f125-4852-ba58-84b9fb4c395a-kube-api-access-k6xqh\") pod \"redhat-marketplace-8kxr5\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.653622 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-catalog-content\") pod \"redhat-marketplace-8kxr5\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.655553 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-utilities\") pod \"redhat-marketplace-8kxr5\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.674530 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6xqh\" (UniqueName: \"kubernetes.io/projected/8135008d-f125-4852-ba58-84b9fb4c395a-kube-api-access-k6xqh\") pod \"redhat-marketplace-8kxr5\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:11 crc kubenswrapper[4831]: I1204 11:56:11.808073 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:12 crc kubenswrapper[4831]: I1204 11:56:12.325156 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kxr5"] Dec 04 11:56:12 crc kubenswrapper[4831]: I1204 11:56:12.980517 4831 generic.go:334] "Generic (PLEG): container finished" podID="8135008d-f125-4852-ba58-84b9fb4c395a" containerID="1086beee152870539b9484050a53fcb57f7f752d157147a6e0752baaad836f7c" exitCode=0 Dec 04 11:56:12 crc kubenswrapper[4831]: I1204 11:56:12.980566 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kxr5" event={"ID":"8135008d-f125-4852-ba58-84b9fb4c395a","Type":"ContainerDied","Data":"1086beee152870539b9484050a53fcb57f7f752d157147a6e0752baaad836f7c"} Dec 04 11:56:12 crc kubenswrapper[4831]: I1204 11:56:12.980598 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kxr5" event={"ID":"8135008d-f125-4852-ba58-84b9fb4c395a","Type":"ContainerStarted","Data":"5afd5a7a137fd96940d0e7545a84dd39569ecf950ec2458259b0bd5c8bcb96da"} Dec 04 11:56:13 crc kubenswrapper[4831]: I1204 11:56:13.990477 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kxr5" event={"ID":"8135008d-f125-4852-ba58-84b9fb4c395a","Type":"ContainerStarted","Data":"ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8"} Dec 04 11:56:14 crc kubenswrapper[4831]: I1204 11:56:14.277765 4831 scope.go:117] 
"RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:56:14 crc kubenswrapper[4831]: E1204 11:56:14.278467 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:56:15 crc kubenswrapper[4831]: I1204 11:56:15.002247 4831 generic.go:334] "Generic (PLEG): container finished" podID="8135008d-f125-4852-ba58-84b9fb4c395a" containerID="ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8" exitCode=0 Dec 04 11:56:15 crc kubenswrapper[4831]: I1204 11:56:15.002331 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kxr5" event={"ID":"8135008d-f125-4852-ba58-84b9fb4c395a","Type":"ContainerDied","Data":"ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8"} Dec 04 11:56:16 crc kubenswrapper[4831]: I1204 11:56:16.016593 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kxr5" event={"ID":"8135008d-f125-4852-ba58-84b9fb4c395a","Type":"ContainerStarted","Data":"59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36"} Dec 04 11:56:16 crc kubenswrapper[4831]: I1204 11:56:16.039279 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8kxr5" podStartSLOduration=2.629908308 podStartE2EDuration="5.039248396s" podCreationTimestamp="2025-12-04 11:56:11 +0000 UTC" firstStartedPulling="2025-12-04 11:56:12.983536967 +0000 UTC m=+6069.932712281" lastFinishedPulling="2025-12-04 11:56:15.392877055 +0000 UTC m=+6072.342052369" observedRunningTime="2025-12-04 
11:56:16.033563263 +0000 UTC m=+6072.982738577" watchObservedRunningTime="2025-12-04 11:56:16.039248396 +0000 UTC m=+6072.988423710" Dec 04 11:56:16 crc kubenswrapper[4831]: I1204 11:56:16.733056 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-8c9449df7-jllzg" podUID="4dc854a6-99fe-4c37-b4a7-7cbbb5f2bde0" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 04 11:56:21 crc kubenswrapper[4831]: I1204 11:56:21.808912 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:21 crc kubenswrapper[4831]: I1204 11:56:21.809445 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:21 crc kubenswrapper[4831]: I1204 11:56:21.855714 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:22 crc kubenswrapper[4831]: I1204 11:56:22.118326 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:22 crc kubenswrapper[4831]: I1204 11:56:22.166811 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kxr5"] Dec 04 11:56:24 crc kubenswrapper[4831]: I1204 11:56:24.097087 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8kxr5" podUID="8135008d-f125-4852-ba58-84b9fb4c395a" containerName="registry-server" containerID="cri-o://59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36" gracePeriod=2 Dec 04 11:56:24 crc kubenswrapper[4831]: I1204 11:56:24.595354 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:24 crc kubenswrapper[4831]: I1204 11:56:24.752065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-catalog-content\") pod \"8135008d-f125-4852-ba58-84b9fb4c395a\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " Dec 04 11:56:24 crc kubenswrapper[4831]: I1204 11:56:24.752223 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-utilities\") pod \"8135008d-f125-4852-ba58-84b9fb4c395a\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " Dec 04 11:56:24 crc kubenswrapper[4831]: I1204 11:56:24.752287 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6xqh\" (UniqueName: \"kubernetes.io/projected/8135008d-f125-4852-ba58-84b9fb4c395a-kube-api-access-k6xqh\") pod \"8135008d-f125-4852-ba58-84b9fb4c395a\" (UID: \"8135008d-f125-4852-ba58-84b9fb4c395a\") " Dec 04 11:56:24 crc kubenswrapper[4831]: I1204 11:56:24.753015 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-utilities" (OuterVolumeSpecName: "utilities") pod "8135008d-f125-4852-ba58-84b9fb4c395a" (UID: "8135008d-f125-4852-ba58-84b9fb4c395a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:56:24 crc kubenswrapper[4831]: I1204 11:56:24.767438 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8135008d-f125-4852-ba58-84b9fb4c395a-kube-api-access-k6xqh" (OuterVolumeSpecName: "kube-api-access-k6xqh") pod "8135008d-f125-4852-ba58-84b9fb4c395a" (UID: "8135008d-f125-4852-ba58-84b9fb4c395a"). InnerVolumeSpecName "kube-api-access-k6xqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:56:24 crc kubenswrapper[4831]: I1204 11:56:24.785962 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8135008d-f125-4852-ba58-84b9fb4c395a" (UID: "8135008d-f125-4852-ba58-84b9fb4c395a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:56:24 crc kubenswrapper[4831]: I1204 11:56:24.855193 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:56:24 crc kubenswrapper[4831]: I1204 11:56:24.855228 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135008d-f125-4852-ba58-84b9fb4c395a-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:56:24 crc kubenswrapper[4831]: I1204 11:56:24.855238 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6xqh\" (UniqueName: \"kubernetes.io/projected/8135008d-f125-4852-ba58-84b9fb4c395a-kube-api-access-k6xqh\") on node \"crc\" DevicePath \"\"" Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.108893 4831 generic.go:334] "Generic (PLEG): container finished" podID="8135008d-f125-4852-ba58-84b9fb4c395a" containerID="59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36" exitCode=0 Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.109001 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kxr5" Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.109001 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kxr5" event={"ID":"8135008d-f125-4852-ba58-84b9fb4c395a","Type":"ContainerDied","Data":"59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36"} Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.115163 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kxr5" event={"ID":"8135008d-f125-4852-ba58-84b9fb4c395a","Type":"ContainerDied","Data":"5afd5a7a137fd96940d0e7545a84dd39569ecf950ec2458259b0bd5c8bcb96da"} Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.115202 4831 scope.go:117] "RemoveContainer" containerID="59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36" Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.140612 4831 scope.go:117] "RemoveContainer" containerID="ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8" Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.150162 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kxr5"] Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.162095 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kxr5"] Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.165784 4831 scope.go:117] "RemoveContainer" containerID="1086beee152870539b9484050a53fcb57f7f752d157147a6e0752baaad836f7c" Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.219066 4831 scope.go:117] "RemoveContainer" containerID="59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36" Dec 04 11:56:25 crc kubenswrapper[4831]: E1204 11:56:25.219780 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36\": container with ID starting with 59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36 not found: ID does not exist" containerID="59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36" Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.219813 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36"} err="failed to get container status \"59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36\": rpc error: code = NotFound desc = could not find container \"59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36\": container with ID starting with 59c93998a048f7dc66954b3c92beb3c182e7ec98a4c6ae2c14e482e57c864c36 not found: ID does not exist" Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.219841 4831 scope.go:117] "RemoveContainer" containerID="ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8" Dec 04 11:56:25 crc kubenswrapper[4831]: E1204 11:56:25.220064 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8\": container with ID starting with ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8 not found: ID does not exist" containerID="ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8" Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.220089 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8"} err="failed to get container status \"ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8\": rpc error: code = NotFound desc = could not find container \"ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8\": container with ID 
starting with ddb8742c7c1d1e1e25b0bf3149d199a6dce8f6eff14c8b0edd324b843535a8c8 not found: ID does not exist" Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.220103 4831 scope.go:117] "RemoveContainer" containerID="1086beee152870539b9484050a53fcb57f7f752d157147a6e0752baaad836f7c" Dec 04 11:56:25 crc kubenswrapper[4831]: E1204 11:56:25.220342 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1086beee152870539b9484050a53fcb57f7f752d157147a6e0752baaad836f7c\": container with ID starting with 1086beee152870539b9484050a53fcb57f7f752d157147a6e0752baaad836f7c not found: ID does not exist" containerID="1086beee152870539b9484050a53fcb57f7f752d157147a6e0752baaad836f7c" Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.220370 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1086beee152870539b9484050a53fcb57f7f752d157147a6e0752baaad836f7c"} err="failed to get container status \"1086beee152870539b9484050a53fcb57f7f752d157147a6e0752baaad836f7c\": rpc error: code = NotFound desc = could not find container \"1086beee152870539b9484050a53fcb57f7f752d157147a6e0752baaad836f7c\": container with ID starting with 1086beee152870539b9484050a53fcb57f7f752d157147a6e0752baaad836f7c not found: ID does not exist" Dec 04 11:56:25 crc kubenswrapper[4831]: I1204 11:56:25.291985 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8135008d-f125-4852-ba58-84b9fb4c395a" path="/var/lib/kubelet/pods/8135008d-f125-4852-ba58-84b9fb4c395a/volumes" Dec 04 11:56:29 crc kubenswrapper[4831]: I1204 11:56:29.277616 4831 scope.go:117] "RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:56:29 crc kubenswrapper[4831]: E1204 11:56:29.278115 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:56:41 crc kubenswrapper[4831]: I1204 11:56:41.277310 4831 scope.go:117] "RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:56:41 crc kubenswrapper[4831]: E1204 11:56:41.278882 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:56:56 crc kubenswrapper[4831]: I1204 11:56:56.276252 4831 scope.go:117] "RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:56:56 crc kubenswrapper[4831]: E1204 11:56:56.278153 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.401961 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rfq7f"] Dec 04 11:57:02 crc kubenswrapper[4831]: E1204 11:57:02.403224 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8135008d-f125-4852-ba58-84b9fb4c395a" containerName="extract-utilities" Dec 04 
11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.403246 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8135008d-f125-4852-ba58-84b9fb4c395a" containerName="extract-utilities" Dec 04 11:57:02 crc kubenswrapper[4831]: E1204 11:57:02.403272 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8135008d-f125-4852-ba58-84b9fb4c395a" containerName="registry-server" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.403281 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8135008d-f125-4852-ba58-84b9fb4c395a" containerName="registry-server" Dec 04 11:57:02 crc kubenswrapper[4831]: E1204 11:57:02.403309 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8135008d-f125-4852-ba58-84b9fb4c395a" containerName="extract-content" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.403319 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8135008d-f125-4852-ba58-84b9fb4c395a" containerName="extract-content" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.403580 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8135008d-f125-4852-ba58-84b9fb4c395a" containerName="registry-server" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.422146 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.424518 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rfq7f"] Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.458594 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-catalog-content\") pod \"community-operators-rfq7f\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.458737 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmnvx\" (UniqueName: \"kubernetes.io/projected/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-kube-api-access-lmnvx\") pod \"community-operators-rfq7f\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.458825 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-utilities\") pod \"community-operators-rfq7f\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.561182 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-catalog-content\") pod \"community-operators-rfq7f\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.561231 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lmnvx\" (UniqueName: \"kubernetes.io/projected/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-kube-api-access-lmnvx\") pod \"community-operators-rfq7f\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.561264 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-utilities\") pod \"community-operators-rfq7f\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.561736 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-catalog-content\") pod \"community-operators-rfq7f\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.561911 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-utilities\") pod \"community-operators-rfq7f\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.586562 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmnvx\" (UniqueName: \"kubernetes.io/projected/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-kube-api-access-lmnvx\") pod \"community-operators-rfq7f\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:02 crc kubenswrapper[4831]: I1204 11:57:02.758328 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:03 crc kubenswrapper[4831]: I1204 11:57:03.335994 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rfq7f"] Dec 04 11:57:03 crc kubenswrapper[4831]: I1204 11:57:03.508452 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfq7f" event={"ID":"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1","Type":"ContainerStarted","Data":"f0e4154cbecb002f4dea6a14b9a84869b220bf9159b6eddeba9d7186eeed1387"} Dec 04 11:57:04 crc kubenswrapper[4831]: I1204 11:57:04.519571 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfq7f" event={"ID":"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1","Type":"ContainerDied","Data":"f47d4f33de62a95d308d2d68530b81d19ab72316b9c6e8e41b9db46e5a7e0158"} Dec 04 11:57:04 crc kubenswrapper[4831]: I1204 11:57:04.519533 4831 generic.go:334] "Generic (PLEG): container finished" podID="f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1" containerID="f47d4f33de62a95d308d2d68530b81d19ab72316b9c6e8e41b9db46e5a7e0158" exitCode=0 Dec 04 11:57:05 crc kubenswrapper[4831]: I1204 11:57:05.544452 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfq7f" event={"ID":"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1","Type":"ContainerStarted","Data":"1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a"} Dec 04 11:57:07 crc kubenswrapper[4831]: I1204 11:57:07.277459 4831 scope.go:117] "RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:57:07 crc kubenswrapper[4831]: E1204 11:57:07.278054 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:57:07 crc kubenswrapper[4831]: I1204 11:57:07.566282 4831 generic.go:334] "Generic (PLEG): container finished" podID="f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1" containerID="1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a" exitCode=0 Dec 04 11:57:07 crc kubenswrapper[4831]: I1204 11:57:07.566337 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfq7f" event={"ID":"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1","Type":"ContainerDied","Data":"1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a"} Dec 04 11:57:08 crc kubenswrapper[4831]: I1204 11:57:08.577431 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfq7f" event={"ID":"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1","Type":"ContainerStarted","Data":"d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881"} Dec 04 11:57:08 crc kubenswrapper[4831]: I1204 11:57:08.603407 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rfq7f" podStartSLOduration=3.133910964 podStartE2EDuration="6.603385891s" podCreationTimestamp="2025-12-04 11:57:02 +0000 UTC" firstStartedPulling="2025-12-04 11:57:04.523530748 +0000 UTC m=+6121.472706062" lastFinishedPulling="2025-12-04 11:57:07.993005675 +0000 UTC m=+6124.942180989" observedRunningTime="2025-12-04 11:57:08.595791667 +0000 UTC m=+6125.544967001" watchObservedRunningTime="2025-12-04 11:57:08.603385891 +0000 UTC m=+6125.552561195" Dec 04 11:57:12 crc kubenswrapper[4831]: I1204 11:57:12.758964 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:12 crc kubenswrapper[4831]: I1204 
11:57:12.760827 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:12 crc kubenswrapper[4831]: I1204 11:57:12.818445 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:13 crc kubenswrapper[4831]: I1204 11:57:13.688894 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:13 crc kubenswrapper[4831]: I1204 11:57:13.737570 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rfq7f"] Dec 04 11:57:15 crc kubenswrapper[4831]: I1204 11:57:15.663831 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rfq7f" podUID="f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1" containerName="registry-server" containerID="cri-o://d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881" gracePeriod=2 Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.252925 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.372857 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-catalog-content\") pod \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.373057 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmnvx\" (UniqueName: \"kubernetes.io/projected/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-kube-api-access-lmnvx\") pod \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.373092 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-utilities\") pod \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\" (UID: \"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1\") " Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.374971 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-utilities" (OuterVolumeSpecName: "utilities") pod "f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1" (UID: "f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.391951 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-kube-api-access-lmnvx" (OuterVolumeSpecName: "kube-api-access-lmnvx") pod "f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1" (UID: "f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1"). InnerVolumeSpecName "kube-api-access-lmnvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.447706 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1" (UID: "f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.475719 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.475994 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmnvx\" (UniqueName: \"kubernetes.io/projected/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-kube-api-access-lmnvx\") on node \"crc\" DevicePath \"\"" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.476091 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.678389 4831 generic.go:334] "Generic (PLEG): container finished" podID="f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1" containerID="d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881" exitCode=0 Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.678435 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfq7f" event={"ID":"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1","Type":"ContainerDied","Data":"d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881"} Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.678463 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-rfq7f" event={"ID":"f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1","Type":"ContainerDied","Data":"f0e4154cbecb002f4dea6a14b9a84869b220bf9159b6eddeba9d7186eeed1387"} Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.678461 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rfq7f" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.678481 4831 scope.go:117] "RemoveContainer" containerID="d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.704611 4831 scope.go:117] "RemoveContainer" containerID="1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.724804 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rfq7f"] Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.736584 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rfq7f"] Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.742577 4831 scope.go:117] "RemoveContainer" containerID="f47d4f33de62a95d308d2d68530b81d19ab72316b9c6e8e41b9db46e5a7e0158" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.790267 4831 scope.go:117] "RemoveContainer" containerID="d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881" Dec 04 11:57:16 crc kubenswrapper[4831]: E1204 11:57:16.791030 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881\": container with ID starting with d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881 not found: ID does not exist" containerID="d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 
11:57:16.791103 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881"} err="failed to get container status \"d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881\": rpc error: code = NotFound desc = could not find container \"d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881\": container with ID starting with d69130ebdb982a4a4674a58089da5ad5a74c32fdc1267c854d3a6af3d8f05881 not found: ID does not exist" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.791148 4831 scope.go:117] "RemoveContainer" containerID="1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a" Dec 04 11:57:16 crc kubenswrapper[4831]: E1204 11:57:16.791936 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a\": container with ID starting with 1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a not found: ID does not exist" containerID="1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.791995 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a"} err="failed to get container status \"1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a\": rpc error: code = NotFound desc = could not find container \"1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a\": container with ID starting with 1781645853c8b512b5afb3a8432d5ab0f18a8b5f93b6d04b0684614e0943983a not found: ID does not exist" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.792031 4831 scope.go:117] "RemoveContainer" containerID="f47d4f33de62a95d308d2d68530b81d19ab72316b9c6e8e41b9db46e5a7e0158" Dec 04 11:57:16 crc 
kubenswrapper[4831]: E1204 11:57:16.792368 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f47d4f33de62a95d308d2d68530b81d19ab72316b9c6e8e41b9db46e5a7e0158\": container with ID starting with f47d4f33de62a95d308d2d68530b81d19ab72316b9c6e8e41b9db46e5a7e0158 not found: ID does not exist" containerID="f47d4f33de62a95d308d2d68530b81d19ab72316b9c6e8e41b9db46e5a7e0158" Dec 04 11:57:16 crc kubenswrapper[4831]: I1204 11:57:16.792390 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47d4f33de62a95d308d2d68530b81d19ab72316b9c6e8e41b9db46e5a7e0158"} err="failed to get container status \"f47d4f33de62a95d308d2d68530b81d19ab72316b9c6e8e41b9db46e5a7e0158\": rpc error: code = NotFound desc = could not find container \"f47d4f33de62a95d308d2d68530b81d19ab72316b9c6e8e41b9db46e5a7e0158\": container with ID starting with f47d4f33de62a95d308d2d68530b81d19ab72316b9c6e8e41b9db46e5a7e0158 not found: ID does not exist" Dec 04 11:57:17 crc kubenswrapper[4831]: I1204 11:57:17.295513 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1" path="/var/lib/kubelet/pods/f5627ac6-3bc9-4fdc-bbe7-3f07d207b7d1/volumes" Dec 04 11:57:21 crc kubenswrapper[4831]: I1204 11:57:21.276854 4831 scope.go:117] "RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:57:21 crc kubenswrapper[4831]: E1204 11:57:21.277748 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:57:35 crc 
kubenswrapper[4831]: I1204 11:57:35.276688 4831 scope.go:117] "RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:57:35 crc kubenswrapper[4831]: E1204 11:57:35.277502 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387" Dec 04 11:57:47 crc kubenswrapper[4831]: I1204 11:57:47.276925 4831 scope.go:117] "RemoveContainer" containerID="4f21f7414de02ac92f4a182ade6a48ec3bca0e71e30c3a2fe8a39329e663fab1" Dec 04 11:57:47 crc kubenswrapper[4831]: E1204 11:57:47.278379 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g76nn_openshift-machine-config-operator(8475bb26-8864-4d49-935b-db7d4cb73387)\"" pod="openshift-machine-config-operator/machine-config-daemon-g76nn" podUID="8475bb26-8864-4d49-935b-db7d4cb73387"